Allo doesn’t offer end-to-end encryption by default because it would disable Google Assistant

When Google unveiled Allo — their smart messaging app coming soon to Android and iOS — one of the more interesting features they revealed was end-to-end encryption. As we later learned, the technology powering Allo’s end-to-end encryption was built upon Signal Protocol, the same open-source protocol from Open Whisper Systems that WhatsApp currently uses.
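For readers unfamiliar with the term, the core idea of end-to-end encryption is that the two devices agree on a secret key and the service in the middle only ever relays ciphertext. The sketch below (Python, using the third-party cryptography package) illustrates that idea with a plain X25519 key exchange and AES-GCM; it is only a conceptual illustration and not the Signal Protocol itself, which layers prekeys, the Double Ratchet and forward secrecy on top of this basic exchange.

    # Conceptual sketch only: NOT the Signal Protocol, just the basic idea that
    # the endpoints derive a shared key and the server never sees plaintext.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    alice = X25519PrivateKey.generate()   # each party keeps its private key on-device
    bob = X25519PrivateKey.generate()

    def derive_key(my_private, their_public):
        # Both sides compute the same shared secret and stretch it into an AES key.
        shared = my_private.exchange(their_public)
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"e2e-demo").derive(shared)

    key_a = derive_key(alice, bob.public_key())
    key_b = derive_key(bob, alice.public_key())
    assert key_a == key_b                  # same key on both ends, never sent over the wire

    nonce = os.urandom(12)
    ciphertext = AESGCM(key_a).encrypt(nonce, b"dinner at 7?", None)  # what the server relays
    print(AESGCM(key_b).decrypt(nonce, ciphertext, None))             # what the recipient reads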

We’ve known since the announcement that E2E encryption was a feature of Allo’s Incognito mode, but now Ars Technica has confirmed exactly why this is the case. Because Google Assistant is such a huge part of Allo, encrypting every conversation end to end by default would leave Google unable to read messages and provide smart suggestions for restaurants or quick replies.

This came after Thai Duong, a co-lead of Google’s product security team, made it known in a blog post that he wished Allo’s E2E encryption were enabled by default rather than being an option limited to Incognito mode and left up to the user. The sentiment was echoed by Edward Snowden, who advised in a tweet that users avoid Allo for now.

It didn’t take long for Duong’s higher-ups to get word, and the blog post was promptly revised (several times, actually). Duong did mention that it would be possible for Google to add a default encryption option in which Google Assistant would only work when messaged directly, but there are currently no plans for such a feature.

In the end, what it comes down to is whether the user values Google Assistant over the privacy of Incognito Mode. It’d be nice to have both, but for now it’s just one or the other.

Google engineer says he’ll push for default end-to-end encryption in Allo

Google’s decision not to provide end-to-end encryption by default in its new chat app, Allo, raised questions about the balance between security and effective artificial intelligence. Now one of the company’s top security engineers has said he’ll push for end-to-end encryption to become the default in future versions of Allo.

Allo debuted with an option to turn on end-to-end encryption, dubbed “incognito mode.” Google obviously takes security seriously, but had to compromise on strong encryption in Allo in order for its AI to work. (Allo messages are encrypted in transit and at rest.)

Thai Duong, an engineer who co-leads Google’s product security team, wrote in a blog post today that he’d push for end-to-end encryption in Allo — then quietly deleted two key paragraphs from his post. In the version he originally published, Duong wrote:

[Screenshot: the two paragraphs Duong originally published, since removed from his post]

These two paragraphs have been erased from the version of Duong’s post that is currently live.

This edit probably doesn’t mean that Duong won’t continue to lobby internally for end-to-end encryption — his job is to make Google’s products as secure as possible. But Google, like most major companies, is pretty cagey about revealing its plans for future products and likely didn’t want Duong to reveal on his personal blog what’s next for Allo.

Even without the paragraphs on end-to-end encryption, Duong’s post offers interesting insight into Google’s thinking as it planned to launch Allo. For users who care about the security of their messaging apps, Duong highlights that it’s not encryption that matters most to Allo, but rather the disappearing message feature.

“Most people focus on end-to-end encryption, but I think the best privacy feature of Allo is disappearing messaging,” Duong wrote. “This is what users actually need when it comes to privacy. Snapchat is popular because they know exactly what users want.”

Duong also confirmed the likely reason Google didn’t choose to enable end-to-end encryption in Allo by default: doing so would interfere with some of the cool AI features Allo offers. For users who don’t choose to enable end-to-end encryption, Allo will run AI that offers suggestions, books dinner reservations and buys movie tickets. But the AI won’t work if it can’t scan a user’s messages, and it gets locked out if the user enables end-to-end encryption.

We reached out to Google to ask whether the company asked Duong to edit his blog post and will update if we hear back. Duong stressed that the post reflected only his personal beliefs, not those of Google, and we hope his advocacy for a default incognito mode comes to fruition.

OSGP custom RC4 encryption cracked yet again

The Open Smart Grid Protocol’s (OSGP) home-grown RC4 encryption has been cracked once again; the easy-to-break custom cipher was first cracked last year.

A year ago the OSGP Alliance advised that better security would be implemented, but RC4 remains in use, according to German researchers Linus Feiten and Matthias Sauer.

Feiten and Sauer claim to have the ability to extract the secret key used in the OSGP’s RC4 stream cipher. “Our new method comprises the modification of a known attack exploiting biases in the RC4 cipher stream output to effectively calculate the secret encryption key. Once this secret key is obtained, it can be used to decrypt all intercepted data sent in an OSCP smart grid,” Sauer and Feiten explained in their research.
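Feiten and Sauer’s actual method isn’t reproduced here, but the family of attacks they build on exploits the fact that RC4’s keystream is measurably non-uniform. A small Python sketch of the best-known example, the Mantin–Shamir bias (the second keystream byte is zero roughly twice as often as chance), shows the kind of statistical signal such attacks accumulate across many encryptions:

    # Demonstrates a classic RC4 keystream bias (Mantin-Shamir): across random
    # keys, the second output byte is 0 with probability ~1/128 instead of 1/256.
    # This illustrates RC4's weakness, not Feiten and Sauer's specific attack.
    import os

    def rc4_keystream(key, n):
        S = list(range(256))
        j = 0
        for i in range(256):                          # key-scheduling algorithm
            j = (j + S[i] + key[i % len(key)]) % 256
            S[i], S[j] = S[j], S[i]
        i = j = 0
        out = []
        for _ in range(n):                            # pseudo-random generation
            i = (i + 1) % 256
            j = (j + S[i]) % 256
            S[i], S[j] = S[j], S[i]
            out.append(S[(S[i] + S[j]) % 256])
        return out

    trials = 100_000
    hits = sum(rc4_keystream(os.urandom(16), 2)[1] == 0 for _ in range(trials))
    print(f"P(second byte == 0) ~= {hits / trials:.5f}, unbiased would be {1/256:.5f}")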

Recovering the secret key can expose an individual customer’s energy consumption, and an attacker could also craft messages reporting incorrect information to the grid operator.

Grid operators have been waiting on vendor support before they can protect their networks with the alliance’s OSGP-AES-128-PSK specification, 128-bit AES encryption with pre-shared keys that was released in July and described as a “new work proposal for standardisation purposes”.
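The article doesn’t describe the OSGP-AES-128-PSK specification’s framing or key-management details, but the name indicates standard 128-bit AES under a pre-shared key. As a rough illustration of what replacing the custom RC4 scheme with authenticated AES looks like, here is a generic Python sketch (again using the cryptography package); the payload and labels are hypothetical.

    # Generic illustration of authenticated AES-128 encryption under a
    # pre-shared key; not the actual OSGP-AES-128-PSK message format.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    psk = os.urandom(16)                     # 128-bit key shared by meter and concentrator
    reading = b"meter 4711: 3.2 kWh"         # hypothetical payload

    nonce = os.urandom(12)                   # must never repeat under the same key
    wire = nonce + AESGCM(psk).encrypt(nonce, reading, b"osgp-demo")

    # Receiver side: the same pre-shared key both decrypts and authenticates,
    # so forged or tampered messages are rejected rather than accepted.
    print(AESGCM(psk).decrypt(wire[:12], wire[12:], b"osgp-demo"))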

John McAfee claims to have hacked WhatsApp’s encrypted messages, but the real story could be different

Last month, WhatsApp enabled end-to-end encryption for its billion users to secure all communications between them, be it group chats, voice calls, personal chats or the photos and videos being shared. While WhatsApp says it is difficult even for the company itself to access conversations, cybersecurity expert John McAfee and his team of four hackers claim to have successfully read an encrypted WhatsApp message, Cybersecurity Ventures reports. While it sounds like a bold claim, the real story could be completely different.

John McAfee, the creator of one of the most popular anti-virus products, apparently tried to trick the media into believing that he had hacked the encryption used by WhatsApp, Gizmodo reports. To convince reporters that he could read encrypted conversations, McAfee is said to have sent out two phones preinstalled with malware containing a keylogger.

According to Dan Guido, a cybersecurity expert who was contacted to verify the claim, McAfee sent two Samsung phones in sealed boxes to the reporter. The experts then took the phones out and exchanged a text on WhatsApp, which McAfee was able to read over a Skype call. Citing sources, the publication also reports that McAfee offered his story to a couple of big publications, including Russia Today and the International Business Times.

“John McAfee was offering to a different couple of news organizations to mail them some phones, have people show up, and then demonstrate with those two phones that [McAfee] in a remote location would be able to read the message as it was sent across the phones. I advised the reporter to go out and buy their own phones, because even though they come in a box it’s very easy to get some saran wrap and a hair dryer to rebox them,” Guido told the publication.

McAfee has a long history of being shifty, especially when it comes to his alleged cybersecurity exploits. For instance, in March this year he claimed he could hack into San Bernardino terrorist Syed Farook’s phone, but he never managed to prove his claim. McAfee later admitted that he had lied to get public attention.

This time, too, McAfee seems to have lied to get reporters to buy his story, and when they asked to verify the claim, he changed it. Moxie Marlinspike, who developed the encryption protocol that WhatsApp implemented, told the publication that McAfee had admitted his plan.

“I talked to McAfee on the phone, he reluctantly told me that it was a malware thing with pre-cooked phones, and all the outlets he’d contacted decided not to cover it after he gave them details about how it’d work,” he said.

With McAfee’s claims turning out to be false, WhatsApp’s assertion that it does not have the ‘key’ to decrypt communications still holds up. If someone someday does manage to break into the conversations, however, it could wreak havoc: while that would make it possible to monitor conversations between terrorists, it would also breach the privacy of ordinary users.

Legal effects of encryption bills discussed at dark web event

An attorney who has worked for the U.S. Army and the Central Intelligence Agency discussed attempts to regulate encryption technologies at the Inside Dark Web conference in New York City on Thursday.

“State legislative response may be un-Constitutional, because it would place a burden on interstate commerce,” said Blackstone Law Group partner Alexander Urbeis. “So they may, in fact, be a way to encourage the federal government to enact encryption legislation.” Several states, including California, Louisiana, and New York, have introduced encryption legislation recently.

California’s “Assembly Bill 1681,” which would have created a $2,500 penalty for phone manufacturers and operating system providers that leased or sold smartphones in the state for each instance in which they did not obey a court order to decrypt a phone, was defeated last month. A similar bill proposed in New York is currently in committee.

“The economic implications would outstrip the privacy implications,” Urbeis said, discussing the effects of the encryption bill sponsored by Sen. Dianne Feinstein (D-Calif.) and Senate Intelligence Committee Chairman Richard Burr (R-N.C.). “The economic implications of these legislation have not been fully thought through. They are obviously going to become very attractive targets for hackers, criminal groups.”

Urbeis also heads Black Chambers, an information security firm that protects legal privilege. Many law firms “have lost the confidence of clients to protect their data,” he said, discussing the reaction to the Panama Papers. “Law firms have been for a long time the soft underbelly of their clients,” he said.

American ISIS Recruits Down, but Encryption Is Helping Terrorists’ Online Efforts, Says FBI Director

The number of Americans traveling to the Middle East to fight alongside Islamic State has dropped, but the terrorist group’s efforts to radicalize people online are getting a major boost from encryption technology, FBI Director James Comey said Wednesday.

Since August, just one American a month has traveled or attempted to travel to the Middle East to join the group, compared with around six to 10 a month in the preceding year and a half, Mr. Comey told reporters in a round table meeting at FBI headquarters.

However, federal authorities have their hands full trying to counter Islamic State’s social media appeal. Of around 1,000 open FBI investigations into people who may have been radicalized across the U.S., about 80% are related to Islamic State, Mr. Comey said.

The increasing use of encrypted communications is complicating law enforcement’s efforts to protect national security, said Mr. Comey, calling the technology a “huge feature of terrorist tradecraft.”

The FBI director cited Facebook Inc.’s WhatsApp texting service, which last month launched end-to-end encryption in which only the sender and receiver are able to read the contents of messages.

“WhatsApp has over a billion customers—overwhelmingly good people but in that billion customers are terrorists and criminals,” Mr. Comey said. He predicted an inevitable “collision” between law enforcement and technology companies offering such services.

Silicon Valley leaders argue that stronger encryption is necessary to protect consumers from a variety of threats.

“While we recognize the important work of law enforcement in keeping people safe, efforts to weaken encryption risk exposing people’s information to abuse from cybercriminals, hackers and rogue states,” WhatsApp CEO Jan Koum wrote last month in a blog post accompanying the rollout of the stronger encryption technology. The company Wednesday declined to comment on Mr. Comey’s remarks.

The FBI also continues to face major challenges in unlocking phones used by criminals, including terrorists, Mr. Comey said. Investigators have been unable to unlock around 500 of the 4,000 or so devices the FBI has examined in the first six months of this fiscal year, which began Oct. 1, he said.

“I expect that number just to grow as the prevalence of the technology grows with newer models,” Mr. Comey added.

A terrorist’s locked iPhone recently sparked a high-stakes legal battle between the Justice Department and Apple Inc. After Syed Rizwan Farook and his wife killed 14 people and wounded 22 in a December shooting rampage in San Bernardino, Calif., FBI agents couldn’t unlock the phone of Mr. Farook—who, along with his wife, was killed later that day in a shootout with police.

The government tried to force Apple to write software to open the device, but the technology company resisted, saying that such an action could compromise the security of millions of other phones.

That court case came to an abrupt end in March, when the FBI said it no longer needed Apple’s help because an unidentified third party had shown it a way to bypass the phone’s security features.

Users’ interest should drive encryption policy: IAMAI

Encryption is a fundamental and necessary tool to safeguard digital communication infrastructure but the interests of Internet users should be foremost in framing any policy, the Internet and Mobile Association of India (IAMAI) said here on Tuesday.

“Trust, convenience and confidence of users are the keywords to designing an ideal encryption policy that will help in getting more people online with safe and secured internet platforms,” said IAMAI president Subho Ray.

The association, which has published a discussion paper on encryption policy, suggests that a broad-based public consultation with all stakeholders, including user groups, should precede the framing of an encryption policy.

According to the paper, the foundation of a user-centric encryption policy consists of freedom of encryption, a strong base encryption standard, no plaintext storage, and monitoring only under legal mandate rather than through backdoor entry.

An essential element of the paper is the suggestion that support for strong encryption is critical to countering cyber security issues around the globe; it also pitches the importance of freedom of encryption for users, organisations and business entities.

Encryption: Friend of Freedom, Guardian of Privacy

The issue of government access to private encrypted data has been in the public eye since the San Bernardino shootings in December 2015. When an iPhone was found, the FBI requested that Apple write code to override the phone’s security features. The FBI was ultimately able to decrypt the phone without Apple’s assistance. However, the ensuing debate over encryption has only just begun.

High-profile criminal and national security cases shed light on an issue that is pervasive throughout the country. Local governments presumably have thousands of devices they would like to decrypt for investigatory purposes; New York City alone has hundreds. Seeking a resolution, and remembering the horrific terror attacks of September 11, 2001, New York State Assembly Bill A8093A, currently in committee, seeks to outlaw the sale of phones in the state whose encryption cannot be bypassed by law enforcement.

Encryption allows for the safekeeping and targeted dissemination of private thoughts and information without worry of judgment, retaliation or mistreatment. On a grander scale, encryption prevents unchecked government oversight. It can be argued that encryption technology is a hedge against current and future totalitarian regimes. Given their histories of occupation and abuse of power, it is no surprise that Germany and France are not pushing for encryption backdoors.

Backdoors in encrypted devices and software provide another avenue for unwelcome parties to gain access. Hackers are often intelligent and well funded, and they act on their own, in groups and, most harmfully, with foreign entities. Holes have a way of being found, and master keys have a way of being lost.

Senators Richard Burr and Dianne Feinstein are undoubtedly well intentioned with their draft law, the Compliance with Court Orders Act of 2016. The act calls for providers of communication services, including software publishers, to decrypt data when served with a court order; the data would have to be provided in an intelligible format, or technical assistance supplied for its retrieval. Prosecutors have a need to gather evidence. Governments have a duty to prevent crime and acts of terror.

However, experts question the feasibility of building backdoors into all types of encryption as it comes in many forms and from a host of global providers. Further, there is concern that the measure, if adopted, will backfire as the targeting of backdoors by our adversaries is assured. Cyberwar in the form of illicit data collection, theft of trade secrets and access to infrastructure is all too common and may escalate as tensions rise between adversaries. Ransomware and cyber extortion have been spreading, most recently at hospitals, and the knowledge of the existence of backdoors will motivate those who seek unseemly profits.

Efforts to prosecute the accused and to fight crime and terror are noble causes. However, government should be wise in its approach lest we weaken our shared defenses in the process. The big corporate names of Silicon Valley recognize the dangers of backdoors and are speaking out and lobbying against Senators Burr and Feinstein’s efforts. The draft legislation does ensure that the monetary cost of decrypting is paid to the “covered entity.” However, the costs to society at large remain up for discussion.

The encryption challenge

IT managers know the movies get it wrong. A teenager with a laptop cannot crack multiple layers of encryption — unless that laptop is connected to a supercomputer somewhere and the teenager can afford to wait a few billion years.
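If anything, “a few billion years” undersells it for modern ciphers. A quick back-of-the-envelope calculation, assuming a 128-bit key and an optimistic trillion guesses per second, puts an exhaustive search at around 10^18 years:

    # Rough brute-force estimate for a 128-bit key at an assumed 10^12 guesses/sec.
    keys = 2 ** 128                                # possible AES-128 keys
    guesses_per_second = 10 ** 12                  # a very generous attacker
    seconds_per_year = 60 * 60 * 24 * 365
    years = keys / (2 * guesses_per_second * seconds_per_year)   # /2 for the average case
    print(f"~{years:.1e} years")                   # roughly 5e18 years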

Encryption works. It works so well that even the government gets stymied, as demonstrated by the lengths to which the FBI went to access an iPhone used by one of the San Bernardino, Calif., shooters.

So in the face of ever more damaging stories about data breaches, why aren’t all government agencies encrypting everything, everywhere, all the time?

Encryption can be costly and time consuming. It can also be sabotaged by users and difficult to integrate with legacy applications.

Furthermore, according to a recent 451 Research survey of senior security executives, government agencies seem to be fighting the previous war. Instead of protecting data from hackers who’ve already gotten in, they’re still focusing on keeping the bad guys out of their systems.

Among U.S. government respondents, the top category for increased spending in the next 12 months was network defenses — at 53 percent. By comparison, spending for data-at-rest defenses such as encryption ranked dead last, with just 37 percent planning to increase their spending.

Part of the reason for those figures is that government agencies overestimate the benefits of perimeter defenses. Sixty percent said network defenses were “very” effective, a higher percentage than any other category, while government respondents ranked data-at-rest defenses as less effective than respondents in any other category.

There was a time when that attitude made sense. “Organizations used to say that they wouldn’t encrypt data in their data centers because they’re behind solid walls and require a [password] to get in,” said Steve Pate, chief architect at security firm HyTrust.

That attitude, however, runs counter to the modern reality that there is no longer a perimeter to protect. Every organization uses third-party service providers, offers mobile access or connects to the web — or a combination of all three.

A security audit at the Office of Personnel Management, for example, showed that use of multifactor authentication, such as the government’s own personal identity verification card readers, was not required for remote access to OPM applications. That made it easy for an attacker with a stolen login and password to bypass all perimeter defenses and directly log into the OPM systems.

An over-reliance on perimeter defenses also means that government agencies pay less attention to where their important data is stored than they should.

According to the 451 Research survey, government respondents were among those with the lowest confidence in the security of their sensitive data’s location. Although 50 percent of financial-sector respondents expressed confidence, only 37 percent of government respondents could say the same.

In fact, only 16 percent of all respondents cited “lack of perceived need” as a barrier to adopting data security, but among government respondents the figure was 31 percent, almost twice as many.

Earlier this year, the Ponemon Institute released a report showing that 33 percent of government agencies use encryption extensively, compared to 41 percent of companies in general and far behind the financial sector at 56 percent. In that survey of more than 5,000 technology experts, 16 percent of agency respondents said they had no encryption strategy.

On a positive note, the public sector has been making headway. Last year, for example, only 25 percent of government respondents to the Ponemon survey said they were using encryption extensively.

“This is showing heightened interest in data protection,” said Peter Galvin, vice president of strategy at Thales e-Security, which sponsored the Ponemon report. High-profile data breaches have drawn public attention to the issue, he added.

ON ENCRYPTION: THERE ARE NO LOCKS ONLY “ANGELS” CAN OPEN

Despite the FBI dropping its case against Apple over whether the tech giant should supply the government agency with the ability to hack into the San Bernardino shooter’s iPhone, the argument over how our devices — especially our phones — should be encrypted continues to rage on. And regardless of how you feel about the issue, almost everybody agrees that the debate can be pretty murky, as privacy vs. protection debates usually are. To make the whole argument a lot easier to digest, however, we have one of the web’s best educators and entertainers, CGP Grey, who has broken it all down in one clear five-minute video.

The video, posted above, parallels physical locks with digital “locks” (encryption), noting how they relate and how they differ in order to help us better understand the encryption debate. And one of the most important points that Grey makes about digital locks is that they need to work not only against local threats, but threats from across the globe — threats coming from “internet burglars” and their “burglar bots.”

Grey touches on the scenario in which a bad guy with an armed bomb dies, leaving behind only an encrypted phone with the code to stop the bomb. In this particular case — a parallel to the San Bernardino shooter case, as there may have been information regarding further threats on his encrypted phone — Grey points out that this may be a time when we’d want the police to have access, or a “backdoor,” to the phone. But if companies were forced to build backdoors into their products so government agencies could use them for situations like these, could we trust authorities not to abuse their powers? Could we trust that “demons” (people with bad intentions) wouldn’t hijack the backdoors?

Grey argues that we couldn’t, saying that “there’s no way to build a digital lock that only angels can open and demons cannot.”

There’s also a bonus “footnote” video (below) in which Grey discusses just how intimate the data on our phones has become (do you remember where you were on April 8th at 6:02AM? No? Your phone does).

What do you think about CGP Grey’s breakdown of the encryption debate?