Mark Zuckerberg Defends Apple’s Stance On Encryption

The battle over data encryption on our mobile devices has heated up considerably over the past few weeks, and it looks set to come to a boil soon as tech companies and industry moguls alike join Apple in its defense of encryption. It all began back in 2013, when Edward Snowden blew the whistle on the US government’s PRISM domestic spying program, revealing that our mobile devices might be feeding the government far more information than anyone had suspected. Ever since, tech companies have been steadily locking down our personal information, whether by adding two-step verification to logins or by encrypting the data stored on the device itself. Google and Apple, alongside nearly 150 other big-name tech companies, petitioned President Obama to support data encryption nearly a year ago, and that letter now looks remarkably timely.

If you’ve been keeping up with the news lately, you’ll know that the San Bernardino shooter used an encrypted iPhone, a protection Apple has been known for since making it standard some time ago. Joining Apple in its quest to secure our data is Google, which has begun requiring phones that ship with Android 6.0 Marshmallow to be fully encrypted, keeping your data out of reach of seemingly even the FBI. While the FBI and Apple are locked in a legal battle over the future of encryption, company after company has begun pressing the FBI to deal with the problem another way and preserve encryption, and users’ privacy, as a whole.

Facebook founder, CEO and multi-billionaire Mark Zuckerberg has joined this fight alongside his company, working to keep encrypted phone sales from being banned in the US. He also opposes the FBI’s suggested alternative: building a backdoor into encryption methods for law enforcement. Such a move would eliminate the strongest security measure users have to protect their devices and defeat the purpose of encryption in the first place. Zuckerberg’s support of encryption is a huge win for the tech community, and it further backs the view of people like John McAfee that the FBI has fallen far behind the times and needs to find other ways of dealing with encryption than trying to remove it.

San Bernardino victims to oppose Apple on iPhone encryption

Some victims of the San Bernardino attack will file a legal brief in support of the U.S. government’s attempt to force Apple Inc to unlock the encrypted iPhone belonging to one of the shooters, a lawyer representing the victims said on Sunday.

Stephen Larson, a former federal judge who is now in private practice, told Reuters that the victims he represents have an interest in the information which goes beyond the Justice Department’s criminal investigation.

“They were targeted by terrorists, and they need to know why, how this could happen,” Larson said.

Larson said he was contacted a week ago by the Justice Department and local prosecutors about representing the victims, prior to the dispute becoming public. He said he will file an amicus brief in court by early March.

A Justice Department spokesman declined to comment on the matter on Sunday.

Larson declined to say how many victims he represents. Fourteen people died and 22 others were wounded in the shooting attack by a married couple who were inspired by Islamic State militants and died in a gun battle with police.

Entry into the fray by victims gives the federal government a powerful ally in its fight against Apple, which has cast itself as trying to protect public privacy from overreach by the federal government.

An Apple spokesman declined to comment. In a letter to customers last week, Tim Cook, the company’s chief executive, said: “We mourn the loss of life and want justice for all those whose lives were affected,” saying that the company has “worked hard to support the government’s efforts to solve this horrible crime.”

Federal Bureau of Investigation Director James Comey said in a letter released on Sunday night that the agency’s request wasn’t about setting legal precedent, but rather seeking justice for the victims and investigating other possible threats.

“Fourteen people were slaughtered and many more had their lives and bodies ruined. We owe them a thorough and professional investigation under law. That’s what this is,” Comey wrote.

The FBI is seeking the tech company’s help to access shooter Syed Rizwan Farook’s phone by disabling some of its passcode protections. The company so far has pushed back, arguing that such a move would set a dangerous precedent and threaten customer security.

The clash between Apple and the Justice Department has driven straight to the heart of a long-running debate over how much law enforcement and intelligence officials should be able to monitor digital communications.

The Justice Department won an order in a Riverside, California federal court on Tuesday against Apple, without the company present in court. Apple is scheduled to file its first legal arguments on Friday, and U.S. Magistrate Judge Sheri Pym, who served as a federal prosecutor before being appointed to the bench, has set a hearing on the issue for next month.

Larson once presided over cases in Riverside, and Pym argued cases in Larson’s courtroom several times as a prosecutor while Larson was a judge, he said. Larson returned to private practice in 2009, saying at the time that a judge’s salary was not enough to provide for his seven children.

He said he is representing the San Bernardino victims for free.

What Tim Cook doesn’t want to admit about iPhones and encryption

When Hillary Clinton called for a “Manhattan-like project” to find a way for the government to spy on criminals without undermining the security of everyone else’s communications, the technology world responded with mockery.

“Also we can create magical ponies who burp ice cream while we’re at it,” snarked prominent Silicon Valley investor Marc Andreessen. Clinton’s idea “makes no sense,” added Techdirt’s Mike Masnick, because “backdooring encryption means that everyone is more exposed to everyone, including malicious hackers.”

It’s an argument that’s been echoed by Apple CEO Tim Cook, who is currently waging a legal battle with the FBI over its request to unlock the iPhone of San Bernardino terrorism suspect Syed Rizwan Farook. “You can’t have a backdoor that’s only for the good guys,” Cook said in November.

There’s just one problem: This isn’t actually true, and the fight over Farook’s iPhone proves it. Apple has tacitly admitted that it can modify the software on Farook’s iPhone to give the FBI access without damaging the security of anyone else’s iPhone.

Claiming that secure back doors are technically impossible is politically convenient. It allows big technology companies like Apple to say that they’d love to help law enforcement but don’t know how to do it without also helping criminals and hackers.

But now, faced with a case where Apple clearly can help law enforcement, Cook is in the awkward position of arguing that it shouldn’t be required to.

Apple isn’t actually worried about the privacy of a dead terrorism suspect. Cook is worried about the legal precedent — not only being forced to help crack more iPhones in the future, but conceivably being forced to build other hacking tools as well.

But by taking a hard line in a case where Apple really could help law enforcement in an important terrorism case — and where doing so wouldn’t directly endanger the security of anyone else’s iPhone — Apple risks giving the impression that tech companies’ objections aren’t being made entirely in good faith.

The San Bernardino case shows secure back doors are possible

Technologists aren’t lying when they say secure back doors are impossible. They’re just talking about something much narrower than what the term means to a layperson. Specifically, their claim is that it’s impossible to design encryption algorithms that scramble data in a way that the recipient and the government — but no one else — can read.

That’s been conventional wisdom ever since 1994, when a researcher named Matt Blaze demonstrated that a government-backed proposal for a back-doored encryption chip had fatal security flaws. In the two decades since, technologists have become convinced that this is something close to a general principle: It’s very difficult to design encryption algorithms that are vulnerable to eavesdropping by one party but provably secure against everyone else. The strongest encryption algorithms we know about are all designed to be secure against everyone.

But the fact that we don’t know how to make an encryption algorithm that can be compromised only by law enforcement doesn’t imply that we don’t know how to make a technology product that can be unlocked only by law enforcement. In fact, the iPhone 5C that Apple and the FBI are fighting about this week is a perfect example of such a technology product.

You can read about how the hack the FBI has sought would work in my previous coverage, or this even more detailed technical analysis. But the bottom line is that the technology the FBI is requesting — and that Apple has tacitly conceded it could build if forced to do so — accomplishes what many back door opponents have insisted is impossible.

Without Apple’s help, Farook’s iPhone is secure against all known attacks. With Apple’s help, the FBI will be able to crack the encryption on Farook’s iPhone. And helping the FBI crack Farook’s phone won’t help the FBI or anyone else unlock anyone else’s iPhone.

It appears, however, that more recent iPhones are not vulnerable to the same kind of attack. (Update: Apple has told Techcrunch that newer iPhones are also vulnerable.) If Farook had had an iPhone 6S instead of an iPhone 5C, it’s likely (though only Apple knows for sure) that Apple could have truthfully said it had no way to help the FBI extract the data.

That worries law enforcement officials like FBI Director James Comey, who has called on technology companies to work with the government to ensure that encrypted data can always be unscrambled. Comey hasn’t proposed a specific piece of legislation, but he is effectively calling on Apple to stop producing technology products like the iPhone 6S that cannot be hacked even with Apple’s help.

The strongest case against back doors is repressive regimes overseas

If you have a lot of faith in the US legal system (and you’re not too concerned about the NSA’s creative interpretations of surveillance law), Comey’s demand might seem reasonable. Law enforcement agencies have long had the ability to get copies of almost all types of private communication and data if they first get a warrant. There would be a number of practical problems with legally prohibiting technology products without back doors, but you might wonder why technology companies don’t just voluntarily design their products to comply with lawful warrants.

But things look different from a global perspective. Because if you care about human rights, then you should want to make sure that ordinary citizens in authoritarian countries like China, Cuba, and Saudi Arabia also have access to secure encryption.

And if technology companies provided the US government with backdoor access to smartphones — either voluntarily or under legal compulsion — it would be very difficult for them to refuse to extend the same courtesy to other, more authoritarian regimes. In practice, providing access to the US government also means providing access to the Chinese government.

And this is probably Apple’s strongest argument in its current fight with the FBI. If the US courts refuse to grant the FBI’s request, Apple might be able to tell China that it simply doesn’t have the software required to help hack into the iPhone 5Cs of Chinese suspects. But if Apple were to create the software for the FBI, the Chinese government would likely put immense pressure on Apple to extend it the same courtesy.

Google CEO Pichai Lends Apple Support on Encryption

Google Chief Executive Sundar Pichai lent support to Apple Inc.’s pushback against a federal order to help law enforcement break into the locked iPhone of an alleged shooter in the San Bernardino, Calif., attacks.

Mr. Pichai wrote on Twitter on Wednesday that “forcing companies to enable hacking could compromise users’ privacy.”

A federal judge Tuesday ordered Apple to enable investigators to bypass the passcode of the iPhone once used by alleged shooter Syed Rizwan Farook. Apple CEO Tim Cook wrote on Apple’s website that such a move would create “a backdoor” around security measures hackers could eventually use to steal iPhone users’ data.

On Twitter, Mr. Pichai called Mr. Cook’s letter an “important post.” He said that while Alphabet Inc.’s Google provides user data to law enforcement under court orders, “that’s wholly different than requiring companies to enable hacking of customer devices and data. Could be a troubling precedent.”

Google, like Apple, has been locked in an intensifying battle with U.S. authorities over the companies’ smartphone encryption software. The firms say that the encryption is crucial to protecting users’ privacy, and keeping their trust. Law enforcement officials say such software hinders criminal investigations, including into the San Bernardino attacks.

Here’s why the FBI forcing Apple to break into an iPhone is a big deal

When U.S. Magistrate Sheri Pym ruled that Apple must help the FBI break into an iPhone belonging to one of the killers in the San Bernardino, Calif., shootings, the tech world shuddered.

Why? The battle of encryption “backdoors” has been longstanding in Silicon Valley, where a company’s success could be made or broken based on its ability to protect customer data.

The issue came into the spotlight after Edward Snowden disclosed the extent to which technology and phone companies were letting the U.S. federal government spy on data transmitted through their networks.

Since Edward Snowden’s whistleblowing revelations, Facebook, Apple and Twitter have each said they will not create such backdoors.

So here’s the “backdoor” the FBI wants: Right now, iPhone users have the option to set a security feature that only allows a certain number of tries to guess the correct passcode to unlock the phone before all the data on the iPhone is deleted. It’s a security measure Apple put in place to keep important data out of the wrong hands.

Federal prosecutors looking for more information behind the San Bernardino shootings don’t know the phone’s passcode. If they guess incorrectly too many times, the data they hope to find will be deleted.

That’s why the FBI wants Apple to disable the security feature. Once the security is crippled, agents would be able to guess as many combinations as possible.
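
The arithmetic behind the request is straightforward: once the retry limit and delays are gone, a short numeric passcode falls to exhaustive guessing. A back-of-the-envelope sketch in Python (the 80 ms per guess is an assumed figure for illustration; Apple has not published exact timings):

```python
# Rough estimate of exhaustive passcode search once retry limits are removed.
# The per-guess time is an assumed figure, not a published Apple specification.

def brute_force_seconds(digits: int, seconds_per_guess: float = 0.08) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    return (10 ** digits) * seconds_per_guess

# A 4-digit passcode has only 10,000 combinations.
print(f"4 digits: about {brute_force_seconds(4) / 60:.0f} minutes worst case")
# A 6-digit passcode has 1,000,000 combinations.
print(f"6 digits: about {brute_force_seconds(6) / 3600:.0f} hours worst case")
```

Under these assumptions a four-digit passcode survives roughly 13 minutes of unthrottled guessing, which is why the retry limit, not the passcode itself, is doing most of the security work.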

Kurt Opsahl, general counsel for the Electronic Frontier Foundation, a San Francisco-based digital rights non-profit, explained that this “backdoor” means Apple will have to write brand-new code that will compromise key features of the phone’s security. Apple has five business days to respond to the request.

What does Apple have to say about this? The company hasn’t commented yet today, but back in December, Apple CEO Tim Cook defended the company’s use of encryption on its mobile devices in a wide-ranging interview with 60 Minutes, saying users should not have to trade privacy for national security. In the interview, Cook stood by the company’s refusal to hand over users’ encrypted texts and messages.

Describing a user’s iPhone, he said: “There’s likely health information, there’s financial information. There are intimate conversations with your family, or your co-workers. There’s probably business secrets, and you should have the ability to protect it. And the only way we know how to do that is to encrypt it. Why is that? It’s because if there’s a way to get in, then somebody will find the way in.”

Cook says Apple cooperates with law enforcement requests, but can’t access encrypted information on users’ smartphones. According to a page on Apple’s website detailing government requests, Apple says encryption data is tied to the device’s passcode.

Cook also dismissed the idea that iPhone users should swap privacy for security. “We’re America. We should have both.”

What does this mean for the next time the government wants access? The order doesn’t create a precedent in the sense that other courts will be compelled to follow it, but it will give the government more ammunition.

What do digital rights experts have to say? There are two things that make this order very dangerous, Opsahl said. The first is the question it raises about who can make this type of demand. If the U.S. government can force Apple to do this, why can’t the Chinese or Russian governments?

The second is that while the government is requesting a program to break into this one specific iPhone, once that program is created it will essentially be a master key. It would be possible for the government to take the key, modify it and use it on other phones. Trusting that the government will hold this power and never misuse it is a big risk, he said.

And the lawmakers? Well, they are torn. Key House Democrat Rep. Adam Schiff (D-Calif.) says Congress shouldn’t force tech companies to build encryption backdoors. Congress is struggling with how to handle the complex issue.

On the other side of things, Senate Intelligence Committee Chairman Richard Burr, R-N.C., and Vice Chair Dianne Feinstein, D-Calif., say they want to require tech companies to provide a backdoor into encrypted communication when law enforcement officials obtain a court order to investigate a specific person.

What now? This could push the tech companies to give users access to unbreakable encryption. To some extent, it’s already happening. Companies like Apple and Google — responding to consumer demands for privacy — have developed smartphones and other devices with encryption so strong that even the companies can’t break it.

Encryption May Hurt Surveillance, but Internet Of Things Could Open New Doors

Tech companies and privacy advocates have been in a stalemate with government officials over how encrypted communication affects the ability of federal investigators to monitor terrorists and other criminals. A new study by Harvard’s Berkman Center for Internet and Society convened experts from all sides to put the issue in context.

The report concluded that information from some apps and devices like smartphones may be harder for government investigators to intercept because of stronger encryption. But, it said, we are connecting so many more things to the Internet (light bulbs, door locks, watches, toasters) that they could create new surveillance channels.

The encryption debate has reheated recently following the attacks in Paris and to some extent San Bernardino, Calif., with CIA and FBI officials warning about their investigation channels “going dark” because of the stronger encryption placed on communications tools like WhatsApp or FaceTime.

(The distinction is this: With things like emails, Web searches, photos or social network posts, information typically gets encrypted on your phone or laptop and then decrypted and stored on a big corporate data server, where law enforcement officials have the technical and legal ability to get access to the content, for instance, with a subpoena. But with messages that are encrypted end-to-end, data gets encrypted on one device and only gets decrypted when it reaches the recipient’s device, making it inaccessible even with a subpoena.)
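
That distinction can be sketched in a few lines of Python. The XOR one-time pad below is a stand-in purely for illustration (real messengers use protocols like Signal’s); the point is that the key lives only on the two endpoint devices, so a server relaying the ciphertext, or a subpoena served on that server, recovers nothing:

```python
import secrets

# Toy illustration of end-to-end encryption: only the endpoints hold the key.
# XOR one-time pad used purely for illustration, not a real messaging protocol.

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
shared_key = secrets.token_bytes(len(message))  # known only to the two devices

ciphertext = encrypt(shared_key, message)   # this is all the relay server sees
recovered = decrypt(shared_key, ciphertext)  # only the recipient can do this
assert recovered == message
```

A server-side subpoena in this model yields only `ciphertext`; without `shared_key`, which never leaves the endpoints, the content is unreadable.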

The agencies have asked for “back doors” into these technologies, though the Obama administration cooled off its push for related legislation late last year over concerns that such security loopholes would also attract hackers and other governments.

But the Harvard report (which was funded by the Hewlett Foundation) argues that “going dark” is a faulty metaphor for the surveillance of the future, thanks to the raft of new technologies that are and likely will remain unencrypted — all the Web-connected home appliances and consumer electronics that sometimes get dubbed the Internet of Things.

Some of the ways the data used to be accessed will undoubtedly become unavailable to investigators, says Jonathan Zittrain, a Harvard professor who was one of the authors. “But the overall landscape is getting brighter and brighter as there are so many more paths by which to achieve surveillance,” he says.

“If you have data flowing or at rest somewhere and it’s held by somebody that can be under the jurisdiction of not just one but multiple governments, those governments at some point or another are going to get around to asking for the data,” he says.

The study team is notable for including technical experts and civil liberties advocates alongside current and former National Security Agency, Defense Department and Justice Department officials. Another chief author was Matthew Olsen, former director of the National Counterterrorism Center and NSA general counsel.

Though not all 14 core members had to agree to every word of the report, they had to approve of the thrust of its findings — with the exception of current NSA officials John DeLong and Anne Neuberger, whose jobs prevented them from signing onto the report (and Zittrain says nothing should be inferred about their views).

The results of the report are a bit ironic: It tries to close one can of worms (the debate over encryption hurting surveillance) but opens another one (the concerns about privacy in the future of Internet-connected everything).

“When you look at it over the long term,” says Zittrain, “with the breadth of ways in which stuff that used to be ephemeral is now becoming digital and stored, the opportunities for surveillance are quite bright, possibly even worryingly so.”

Weak email encryption laws put Aussie consumers at risk of fraud

A consumer alert issued by Victoria’s Legal Services Commissioner a few weeks ago raised, to our mind, an old and curious issue. Why aren’t Australian professionals required to secure their email?

Eighteen years ago, Victoria’s Law Institute Journal carried an excellent feature article on the ease with which email can be forged, the fact that it was already happening and the gold standard technology for mitigating the risk, digital signatures and encryption. We have to say it was excellent, since we wrote it, but it did get a lot of attention. It even won an award. But it had no practical impact at all.

Fast forward to 2016, and the same state’s Legal Services Commissioner is alarmed by a UK report of an email hoax that fleeced a newly married couple of their home deposit. Just as they were waiting for instructions from their lawyers on where to transfer their hard-earned £45,000, fraudsters sent a bogus message that impersonated the lawyers and nominated a false bank account. The hapless couple complied, and the scammers collected their cash.

UNSECURED SYSTEM

The Victorian Commissioner’s alert includes several good points of advice to consumers, like being cautious about links and attachments in emails from unfamiliar senders and using antivirus software. But curiously, it doesn’t canvass the key technology question raised in the UK report: Why wasn’t the lawyers’ email secured against forgery?

The newlywed groom pointed the finger right at the problem, quoted as saying: “Losing this money is bad enough. But what makes it worse is that this could have all been avoided if our emails had been encrypted. It seems crazy to ask us to transfer such huge amounts by sending a bank account number.”

The lawyers’ response: Advantage Property Lawyers said the firm was not responsible for the couple’s loss. It said its emails were not encrypted, but that this was standard industry practice: “We stick to the highest industry standards in all aspects of our business.”

So non-encryption, fairly described by Joe Public as crazy, is the standard industry practice in the UK, just as it is in Australia.

There may be more to this than meets the eye. A couple of years after our 1997 article, we were asked to host a media lunch for Phil Zimmermann, the US tech wizard who created the first user-friendly email encryption and signing software. We invited a senior officer of the Law Institute, thinking the topic would be of vital interest. Apparently not.

Over lunch, Zimmermann offered to supply the Institute with free copies of the tool so it could lead the profession down the road of best practice. For reasons we didn’t understand then and still don’t, the offer created no interest.

LACK OF INTEREST

We recounted the story of that lunch in this column years later, wondering if that would spark some enquiry into the options for fighting exactly the kind of fraud that’s happening in the UK. Silence. It seems that, at the highest levels, legal eagles’ eyes glaze over when the topic of secure email arises. As long as the entire profession ignores the issue, we can all say that “our emails are not encrypted but this is standard industry practice.”

For the record, encryption can help secure email in two ways. First, it can prove that a message is from an authenticated sender, and hasn’t been tampered with in transit. Optionally, it can also scramble the contents of messages so only the intended recipient can read them. Implementing these protections requires some centralised infrastructure and a way to ensure it is used by the target audience. Australia’s law societies are ideally placed to sponsor a more secure system, especially now that a uniform national legal practice regime is in operation.
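
The first of those protections, proving a message is genuine and untampered, can be sketched with a keyed MAC. Real email signing (PGP, S/MIME) uses public-key signatures rather than a shared secret, but the verification idea is the same; the key and messages below are hypothetical:

```python
import hashlib
import hmac

# Simplified sketch of message authentication. PGP and S/MIME use public-key
# signatures instead of a shared secret, but the principle is identical:
# any change to the message invalidates the attached tag.

def sign(key: bytes, message: bytes) -> str:
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    # compare_digest avoids timing side channels during comparison
    return hmac.compare_digest(sign(key, message), tag)

key = b"secret-shared-by-lawyer-and-client"  # hypothetical shared key
msg = b"Transfer the deposit to account 123-456"
tag = sign(key, msg)

assert verify(key, msg, tag)                                 # genuine message
assert not verify(key, b"Transfer to account 999-999", tag)  # forgery rejected
```

Had the lawyers’ emails carried a verifiable tag like this, the fraudsters’ substituted bank-account message would have failed verification on the couple’s end.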

We used Zimmermann’s product for a couple of years, and it was simple. Using an Outlook plug-in, you clicked a button to send a signed message. You entered a password, the software worked its magic in the background, and a digital signature was applied. We gave it up when it became clear that insecure email was set to remain industry best practice for years to come.

Back in 1997, we wrapped up our article with the wildly inaccurate prediction that “in two years, all commercial documentation will be digitally signed. Lawyers have every reason to lead the way.”

Here’s hoping it doesn’t take another 18 years.

Top senator: Encryption bill may “do more harm than good”

Legislating encryption standards might “do more harm than good” in the fight against terrorism, Senate Homeland Security Committee Chairman Ron Johnson (R-Wis.) said on Thursday.

In the wake of the terrorist attacks in Paris and San Bernardino, Calif., lawmakers have been debating whether to move a bill that would force U.S. companies to decrypt data for law enforcement.

“Is it really going to solve any problems if we force our companies to do something here in the U.S.?” Johnson asked at the American Enterprise Institute, a conservative think tank. “It’s just going to move offshore. Determined actors, terrorists, are still going to be able to find a service provider that will be able to encrypt accounts.”

Investigators have said the Paris attackers used encrypted apps to communicate. It’s part of a growing trend, law enforcement says, in which criminals and terrorists are using encryption to hide from authorities.

For many, the solution has been to require that tech companies maintain the ability to decrypt data when compelled by a court order. Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.) are currently working on such a bill.

But the tech community and privacy advocates have pushed back. They warn that any type of guaranteed access to encrypted data puts all secure information at risk: keeping a key around to unlock encryption, they argue, means that anyone, including hackers, can use that key.

Johnson said he understands the importance of strong encryption.

“Let’s face it, encryption helps protect personal information,” he said. “It’s crucial to that. I like the fact that if somebody gets my iPhone, they’re going to have a hard time getting into it.”

Capitol Hill faces a learning curve on the issue, Johnson explained.

“It really is not understanding the complexity,” he said. “And I’m not being critical here. It’s really complex, which is the biggest problem you have in terms of cyber warfare [and] cyberattacks.”

“The experts, the attackers are multiple steps ahead of the good guys trying to reel them in, trying to find them,” Johnson added.

McCaul: US playing ‘catchup’ to terrorists using encryption

The U.S. is playing “catchup” with terrorists and cyber vigilantes who coordinate via encrypted communications, according to the chairman of the House Homeland Security Committee.

“Today’s digital battlefield has many more adversaries than just nation states,” Rep. Michael McCaul (R-Texas) said in a Tuesday column for Bloomberg. “Terrorist groups such as ISIS [the Islamic State in Iraq and Syria], as well as hacktivists … are adept at using encryption technologies to communicate and carry out malicious campaigns, leaving America to play catchup.”

McCaul has been outspoken in the fight between tech companies and law enforcement over the regulation of encryption technology. He is currently prepping legislation that would establish a national commission to find ways to balance the public’s right to privacy with giving police access to encrypted information.

“I do think this is one of the greatest challenges to law enforcement that I have probably seen in my lifetime,” the former federal prosecutor told reporters last week.

Lawmakers are split over whether legislation is needed to address the growing use of technology that can prevent even a device manufacturer from decrypting data.

Tech experts argue that any guaranteed access for law enforcement weakens overall Internet security and makes online transactions such as banking and hotel bookings riskier. Privacy advocates say strong encryption provides important protection to individuals.

But law enforcement officials, along with some lawmakers, continue to argue that impenetrable encryption is a danger to public safety.

“From gang activity to child abductions to national security threats, the ability to access electronic evidence in a timely manner is often essential to successfully conducting lawful investigations and preventing harm to potential victims,” Assistant Attorney General Leslie Caldwell said at the annual State of the Net conference on Monday.

The White House has tried to engage Silicon Valley on the topic, recently meeting with top tech executives on the West Coast. But some lawmakers feel the process should move quicker.

In the upper chamber, Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.) are working on a bill that would force companies to build their encryption so they could respond to a court order for secured data.

Both members of the Intelligence Committee have expressed a desire to move swiftly on encryption legislation and bypass the proposed national commission to study the topic.

McCaul warned that the threats the U.S. faces online “will only grow more prevalent.”

“The security of Americans’ personal information needs to keep pace with the emerging technologies of today,” McCaul said.

Half-Measures on Encryption Since Snowden

When NSA subcontractor Edward Snowden released classified documents in June 2013 laying bare the U.S. intelligence community’s global surveillance programs, the disclosures revealed how lax major Internet companies like Apple, Google, Yahoo, and Microsoft had been about privacy and data security. Warrantless surveillance was possible because data flowed unencrypted between internal company data centers and service providers.

The revelations damaged technology companies’ relationships with businesses and consumers. Various estimates pegged the impact at between $35 billion and $180 billion as foreign business customers canceled service contracts with U.S. cloud computing companies in favor of foreign competitors, and as the companies poured money into PR campaigns to reassure their remaining customers.

There was a silver lining: the revelations catalyzed a movement among technology companies to use encryption to protect users’ data from spying and theft. But the results have been mixed. Major service providers including Google, Yahoo, and Microsoft—which are among the largest providers of cloud- and Web-based services like e-mail, search, storage, and messaging—have indeed encrypted user data flowing across their internal infrastructure. But the same isn’t true in other contexts, such as when data is stored on smartphones or moving across networks in hugely popular messaging apps like Skype and Google Hangouts. Apple is leading the pack: it encrypts data by default on iPhones and other devices running newer versions of its operating system, and it encrypts communications data so that only the sender and receiver have access to it.

But Apple products aren’t widely used in the poor world. Of the 3.4 billion smartphones in use worldwide, more than 80 percent run Google’s Android operating system. Many are low-end phones with less built-in protection than iPhones. This has produced a “digital security divide,” says Chris Soghoian, principal technologist at the American Civil Liberties Union. “The phone used by the rich is encrypted by default and cannot be surveilled, and the phone used by most people in the global south and the poor and disadvantaged in America can be surveilled,” he said at MIT Technology Review’s EmTech conference in November.

Pronouncements on new encryption plans quickly followed the Snowden revelations. In November 2013, Yahoo announced that it intended to encrypt data flowing between its data centers and said it would also encrypt traffic moving between a user’s device and its servers (as signaled by the address prefix HTTPS). Microsoft announced in November and December 2013 that it would expand encryption to many of its major products and services, meaning data would be encrypted in transit and on Microsoft’s servers. Google announced in March 2014 that connections to Gmail would use HTTPS and that it would encrypt e-mails sent to other providers that also support encryption, such as Yahoo. And finally, in 2013 and 2014, Apple implemented the most dramatic changes of all, announcing that the latest version of iOS, the operating system that runs on all iPhones and iPads, would include built-in end-to-end encrypted text and video messaging. Importantly, Apple also announced it would store the keys to decrypt this information only on users’ phones, not on Apple’s servers—making it far more difficult for a hacker, an insider at Apple, or even government officials with a court order to gain access.
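The property that design relies on is that decryption keys never leave the endpoints: the server only relays public values, so it has nothing useful to hand over. A textbook-toy sketch of that idea using a Diffie-Hellman exchange illustrates it (the tiny parameters, names, and helpers here are purely illustrative assumptions; real apps use vetted protocols such as the Signal protocol, not hand-rolled exchanges like this):

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters -- deliberately tiny, NOT secure.
P, G = 23, 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # private key never leaves the device
    pub = pow(G, priv, P)                 # only this public value is transmitted
    return priv, pub

def shared_key(my_priv, their_pub):
    # Each end combines its own private key with the peer's public value.
    # A relaying server sees only the public values and cannot derive this.
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).hexdigest()

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Both endpoints independently derive the same message key.
assert shared_key(alice_priv, bob_pub) == shared_key(bob_priv, alice_pub)
```

Because the server never holds the private keys, even a subpoena to the provider yields only public values, which is the contrast the paragraph above draws with services that keep decryption keys server-side.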

Google, Microsoft, and Yahoo don’t provide such end-to-end encryption of communications data. But users can turn to a rising crop of free third-party apps, like ChatSecure and Signal, that support such encryption and open their source code for review. Relatively few users take the extra step to learn about and use these tools. Still, secure messaging apps may play a key role in making it easier to implement wider encryption across the Internet, says Stephen Farrell, a computer scientist at Trinity College Dublin and a leader of security efforts at the Internet Engineering Task Force, which develops fundamental Internet protocols. “Large messaging providers need to get experience with deployment of end-to-end secure messaging and then return to the standards process with that experience,” he says. “That is what will be needed to really address the Internet-scale messaging security problem.”