What Tim Cook doesn’t want to admit about iPhones and encryption

When Hillary Clinton called for a “Manhattan-like project” to find a way for the government to spy on criminals without undermining the security of everyone else’s communications, the technology world responded with mockery.

“Also we can create magical ponies who burp ice cream while we’re at it,” snarked prominent Silicon Valley investor Marc Andreessen. Clinton’s idea “makes no sense,” added Techdirt’s Mike Masnick, because “backdooring encryption means that everyone is more exposed to everyone, including malicious hackers.”

It’s an argument that’s been echoed by Apple CEO Tim Cook, who is currently waging a legal battle with the FBI over its request to unlock the iPhone of San Bernardino terrorism suspect Syed Rizwan Farook. “You can’t have a backdoor that’s only for the good guys,” Cook said in November.

There’s just one problem: This isn’t actually true, and the fight over Farook’s iPhone proves it. Apple has tacitly admitted that it can modify the software on Farook’s iPhone to give the FBI access without damaging the security of anyone else’s iPhone.

Claiming that secure back doors are technically impossible is politically convenient. It allows big technology companies like Apple to say that they’d love to help law enforcement but don’t know how to do it without also helping criminals and hackers.

But now, faced with a case where Apple clearly can help law enforcement, Cook is in the awkward position of arguing that it shouldn’t be required to.

Apple isn’t actually worried about the privacy of a dead terrorism suspect. Cook is worried about the legal precedent — not only being forced to help crack more iPhones in the future, but conceivably being forced to build other hacking tools as well.

But by taking a hard line in a case where Apple really could help law enforcement in an important terrorism case — and where doing so wouldn’t directly endanger the security of anyone else’s iPhone — Apple risks giving the impression that tech companies’ objections aren’t being made entirely in good faith.

The San Bernardino case shows secure back doors are possible

Technologists aren’t lying when they say secure back doors are impossible. They’re just talking about something much narrower than what the term means to a layperson. Specifically, their claim is that it’s impossible to design encryption algorithms that scramble data in a way that the recipient and the government — but no one else — can read.

That’s been conventional wisdom ever since 1994, when a researcher named Matt Blaze demonstrated that a government-backed proposal for a back-doored encryption chip had fatal security flaws. In the two decades since, technologists have become convinced that this is something close to a general principle: It’s very difficult to design encryption algorithms that are vulnerable to eavesdropping by one party but provably secure against everyone else. The strongest encryption algorithms we know about are all designed to be secure against everyone.

But the fact that we don’t know how to make an encryption algorithm that can be compromised only by law enforcement doesn’t imply that we don’t know how to make a technology product that can be unlocked only by law enforcement. In fact, the iPhone 5C that Apple and the FBI are fighting about this week is a perfect example of such a technology product.

You can read about how the hack the FBI has sought would work in my previous coverage, or this even more detailed technical analysis. But the bottom line is that the technology the FBI is requesting — and that Apple has tacitly conceded it could build if forced to do so — accomplishes what many back door opponents have insisted is impossible.

Without Apple’s help, Farook’s iPhone is secure against all known attacks. With Apple’s help, the FBI will be able to crack the encryption on Farook’s iPhone. And helping the FBI crack Farook’s phone won’t help the FBI or anyone else unlock anyone else’s iPhone.

It appears, however, that more recent iPhones are not vulnerable to the same kind of attack. (Update: Apple has told Techcrunch that newer iPhones are also vulnerable.) If Farook had had an iPhone 6S instead of an iPhone 5C, it’s likely (though only Apple knows for sure) that Apple could have truthfully said it had no way to help the FBI extract the data.

That worries law enforcement officials like FBI Director James Comey, who has called on technology companies to work with the government to ensure that encrypted data can always be unscrambled. Comey hasn’t proposed a specific piece of legislation, but he is effectively calling on Apple to stop producing technology products like the iPhone 6S that cannot be hacked even with Apple’s help.

The strongest case against back doors is repressive regimes overseas

If you have a lot of faith in the US legal system (and you’re not too concerned about the NSA’s creative interpretations of surveillance law), Comey’s demand might seem reasonable. Law enforcement agencies have long had the ability to get copies of almost all types of private communication and data if they first get a warrant. There would be a number of practical problems with legally prohibiting technology products without back doors, but you might wonder why technology companies don’t just voluntarily design their products to comply with lawful warrants.

But things look different from a global perspective. If you care about human rights, then you should want to make sure that ordinary citizens in authoritarian countries like China, Cuba, and Saudi Arabia also have access to secure encryption.

And if technology companies provided the US government with backdoor access to smartphones — either voluntarily or under legal compulsion — it would be very difficult for them to refuse to extend the same courtesy to other, more authoritarian regimes. In practice, providing access to the US government also means providing access to the Chinese government.

And this is probably Apple’s strongest argument in its current fight with the FBI. If the US courts refuse to grant the FBI’s request, Apple might be able to tell China that it simply doesn’t have the software required to help hack into the iPhone 5Cs of Chinese suspects. But if Apple were to create the software for the FBI, the Chinese government would likely put immense pressure on Apple to extend it the same courtesy.

Google CEO Pichai Lends Apple Support on Encryption

Google Chief Executive Sundar Pichai lent support to Apple Inc.’s pushback against a federal order to help law enforcement break into the locked iPhone of an alleged shooter in the San Bernardino, Calif., attacks.

Mr. Pichai wrote on Twitter on Wednesday that “forcing companies to enable hacking could compromise users’ privacy.”

A federal judge Tuesday ordered Apple to enable investigators to bypass the passcode of the iPhone once used by alleged shooter Syed Rizwan Farook. Apple CEO Tim Cook wrote on Apple’s website that such a move would create “a backdoor” around security measures hackers could eventually use to steal iPhone users’ data.

On Twitter, Mr. Pichai called Mr. Cook’s letter an “important post.” He said that while Alphabet Inc.’s Google provides user data to law enforcement under court orders, “that’s wholly different than requiring companies to enable hacking of customer devices and data. Could be a troubling precedent.”

Google, like Apple, has been locked in an intensifying battle with U.S. authorities over the companies’ smartphone encryption software. The firms say that the encryption is crucial to protecting users’ privacy, and keeping their trust. Law enforcement officials say such software hinders criminal investigations, including into the San Bernardino attacks.

Encryption May Hurt Surveillance, but Internet Of Things Could Open New Doors

Tech companies and privacy advocates have been in a stalemate with government officials over how encrypted communication affects the ability of federal investigators to monitor terrorists and other criminals. A new study by Harvard’s Berkman Center for Internet and Society convened experts from all sides to put the issue in context.

The report concluded that information from some apps and devices like smartphones may be harder for government investigators to intercept because of stronger encryption. But, it said, we are connecting so many more things to the Internet (light bulbs, door locks, watches, toasters) that they could create new surveillance channels.

The encryption debate has reheated recently following the attacks in Paris and to some extent San Bernardino, Calif., with CIA and FBI officials warning about their investigation channels “going dark” because of the stronger encryption placed on communications tools like WhatsApp or FaceTime.

(The distinction is this: With things like emails, Web searches, photos or social network posts, information typically gets encrypted on your phone or laptop and then decrypted and stored on a big corporate data server, where law enforcement officials have the technical and legal ability to get access to the content, for instance, with a subpoena. But with messages that are encrypted end-to-end, data gets encrypted on one device and only gets decrypted when it reaches the recipient’s device, making it inaccessible even with a subpoena.)
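
That distinction can be made concrete with a toy sketch. Here a one-time pad stands in for the vetted public-key protocols real messaging apps use, and the hard problem of key exchange is glossed over entirely; the point is simply that in an end-to-end design the relaying server only ever handles ciphertext, so a subpoena served on the server yields nothing readable:

```python
import secrets

def otp_encrypt(data: bytes, key: bytes) -> bytes:
    # XOR each byte with a random key (a one-time pad). The same
    # operation both encrypts and decrypts. Secure only if the key is
    # random, secret, and never reused -- a toy, not a real protocol.
    return bytes(b ^ k for b, k in zip(data, key))

# Sender and recipient share a random key; the server never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = otp_encrypt(message, key)  # all the server relays or stores
# Only the recipient, holding the key, can recover the plaintext:
assert otp_encrypt(ciphertext, key) == message
```

A subpoena against the server in this model produces only `ciphertext`, which is exactly the "inaccessible even with a subpoena" situation described above.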

The agencies have asked for “back doors” into these technologies, though the Obama administration cooled off its push for related legislation late last year over concerns that such security loopholes would also attract hackers and other governments.

But the Harvard report (which was funded by the Hewlett Foundation) argues that “going dark” is a faulty metaphor for the surveillance of the future, thanks to the raft of new technologies that are and likely will remain unencrypted — all the Web-connected home appliances and consumer electronics that sometimes get dubbed the Internet of Things.

Some of the ways the data used to be accessed will undoubtedly become unavailable to investigators, says Jonathan Zittrain, a Harvard professor who was one of the authors. “But the overall landscape is getting brighter and brighter as there are so many more paths by which to achieve surveillance,” he says.

“If you have data flowing or at rest somewhere and it’s held by somebody that can be under the jurisdiction of not just one but multiple governments, those governments at some point or another are going to get around to asking for the data,” he says.

The study team is notable for including technical experts and civil liberties advocates alongside current and former National Security Agency, Defense Department and Justice Department officials. Another chief author was Matthew Olsen, former director of the National Counterterrorism Center and NSA general counsel.

Though not all 14 core members had to agree to every word of the report, they had to approve of the thrust of its findings — with the exception of current NSA officials John DeLong and Anne Neuberger, whose jobs prevented them from signing onto the report (and Zittrain says nothing should be inferred about their views).

The results of the report are a bit ironic: It tries to close one can of worms (the debate over encryption hurting surveillance) but opens another one (the concerns about privacy in the future of Internet-connected everything).

“When you look at it over the long term,” says Zittrain, “with the breadth of ways in which stuff that used to be ephemeral is now becoming digital and stored, the opportunities for surveillance are quite bright, possibly even worryingly so.”

Weak email encryption laws put Aussie consumers at risk of fraud

A consumer alert issued by Victoria’s Legal Services Commissioner a few weeks ago raised, to our mind, an old and curious issue. Why aren’t Australian professionals required to secure their email?

Eighteen years ago, Victoria’s Law Institute Journal carried an excellent feature article on the ease with which email can be forged, the fact that it was already happening and the gold standard technology for mitigating the risk, digital signatures and encryption. We have to say it was excellent, since we wrote it, but it did get a lot of attention. It even won an award. But it had no practical impact at all.

Fast forward to 2016 and the same State’s Legal Services Commissioner is alarmed by a UK report of an email hoax that fleeced a newly married couple of their home deposit. Just when they were waiting for instructions from their lawyers on where to transfer their hard-earned £45,000, fraudsters sent a bogus message that impersonated the attorneys and nominated a false bank account. The hapless couple complied and the scammers collected their cash.

UNSECURED SYSTEM

The Victorian Commissioner’s alert includes several good points of advice to consumers, like being cautious about links and attachments in emails from unfamiliar senders and using antivirus software. But curiously, it doesn’t canvass the key technology question raised in the UK report: Why wasn’t the lawyers’ email secured against forgery?

The newlywed groom pointed the finger right at the problem, quoted as saying: “Losing this money is bad enough. But what makes it worse is that this could have all been avoided if our emails had been encrypted. It seems crazy to ask us to transfer such huge amounts by sending a bank account number.”

The lawyers’ response, as reported: “Advantage Property Lawyers said that the firm was not responsible for the couple’s loss. It said its emails were not encrypted but that this was standard industry practice: ‘We stick to the highest industry standards in all aspects of our business.’”

So non-encryption, fairly described by Joe Public as crazy, is the standard industry practice in the UK, just as it is in Australia.

There may be more to this than meets the eye. A couple of years after our 1997 article, we were asked to host a media lunch for Phil Zimmerman, the US tech wizard who created the first user friendly email encryption and signing software. We invited a senior officer of the Law Institute, thinking the topic would be of vital interest. Apparently not.

Over lunch, Zimmerman offered to supply the Institute with free copies of the tool so it could lead the profession down the road of best practice. For reasons we didn’t understand then and still don’t, the offer created no interest.

LACK OF INTEREST

We recounted the story of that lunch in this column years later, wondering if that would spark some enquiry into the options for fighting exactly the kind of fraud that’s happening in the UK. Silence. It seems that, at the highest levels, legal eagles’ eyes glaze over when the topic of secure email arises. As long as the entire profession ignores the issue, we can all say that “our emails are not encrypted but this is standard industry practice.”

For the record, encryption can help secure email in two ways. First, it can prove that a message is from an authenticated sender, and hasn’t been tampered with in transit. Optionally, it can also scramble the contents of messages so only the intended recipient can read them. Implementing these protections requires some centralised infrastructure and a way to ensure it is used by the target audience. Australia’s law societies are ideally placed to sponsor a more secure system, especially now that a uniform national legal practice regime is in operation.
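
The first protection, proving authenticity and detecting tampering, can be sketched with Python’s standard library. This uses a shared-secret HMAC as a stand-in for the public-key signatures (PGP, S/MIME) a real deployment would use, so the key and messages here are purely hypothetical:

```python
import hmac
import hashlib

# Hypothetical shared secret; real email signing uses asymmetric key
# pairs so no secret needs to be shared in advance.
KEY = b"secret shared between lawyer and client"

def sign(message: bytes) -> str:
    # Produce an authentication tag bound to both the key and the message.
    return hmac.new(KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    # Constant-time comparison; any change to the message invalidates the tag.
    return hmac.compare_digest(sign(message), tag)

instructions = b"Transfer deposit to account 12-3456-789"
tag = sign(instructions)

assert verify(instructions, tag)       # genuine message checks out
forged = b"Transfer deposit to account 99-9999-999"
assert not verify(forged, tag)         # the fraudster's substitution is caught
```

The account-swapping fraud described in the UK report is exactly the substitution the last line detects: a message whose tag fails to verify would be rejected before any money moved.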

We used Zimmerman’s product for a couple of years, and it was simple. Using an Outlook plug-in, you clicked a button to send a signed message. You entered a password, the software worked its magic in the background, and a digital signature was applied. We gave it up when it became clear that insecure email was set to remain industry best practice for years to come.

Back in 1997, we wrapped up our article with the wildly inaccurate prediction that “in two years, all commercial documentation will be digitally signed. Lawyers have every reason to lead the way.”

Here’s hoping it doesn’t take another 18 years.

Top senator: Encryption bill may “do more harm than good”

Legislating encryption standards might “do more harm than good” in the fight against terrorism, Senate Homeland Security Committee Chairman Ron Johnson (R-Wis.) said on Thursday.

In the wake of the terrorist attacks in Paris and San Bernardino, Calif., lawmakers have been debating whether to move a bill that would force U.S. companies to decrypt data for law enforcement.

“Is it really going to solve any problems if we force our companies to do something here in the U.S.?” Johnson asked at the American Enterprise Institute, a conservative think tank. “It’s just going to move offshore. Determined actors, terrorists, are still going to be able to find a service provider that will be able to encrypt accounts.”

Investigators have said the Paris attackers used encrypted apps to communicate. It’s part of a growing trend, law enforcement says, in which criminals and terrorists are using encryption to hide from authorities.

For many, the solution has been to require that tech companies maintain the ability to decrypt data when compelled by a court order. Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.) are currently working on such a bill.

But the tech community and privacy advocates have pushed back. They warn that any type of guaranteed access to encrypted data puts all secure information at risk: keeping a key around to unlock encryption means that anyone, including hackers, can use that key.

Johnson said he understands the importance of strong encryption.

“Let’s face it, encryption helps protect personal information,” he said. “It’s crucial to that. I like the fact that if somebody gets my iPhone, they’re going to have a hard time getting into it.”

Capitol Hill faces a learning curve on the issue, Johnson explained.

“It really is not understanding the complexity,” he said. “And I’m not being critical here. It’s really complex, which is the biggest problem you have in terms of cyber warfare [and] cyberattacks.”

“The experts, the attackers are multiple steps ahead of the good guys trying to reel them in, trying to find them,” Johnson added.

McCaul: US playing ‘catchup’ to terrorists using encryption

The U.S. is playing “catchup” with terrorists and cyber vigilantes who coordinate via encrypted communications, according to the chairman of the House Homeland Security Committee.

“Today’s digital battlefield has many more adversaries than just nation-states,” Rep. Michael McCaul (R-Texas) said in a Tuesday column for Bloomberg. “Terrorist groups such as ISIS [the Islamic State in Iraq and Syria], as well as hacktivists … are adept at using encryption technologies to communicate and carry out malicious campaigns, leaving America to play catchup.”

McCaul has been outspoken in the fight between tech companies and law enforcement over the regulation of encryption technology. He is currently prepping legislation that would establish a national commission to find ways to balance the public’s right to privacy with giving police access to encrypted information.

“I do think this is one of the greatest challenges to law enforcement that I have probably seen in my lifetime,” the former federal prosecutor told reporters last week.

Lawmakers are split over whether legislation is needed to address the growing use of technology that can prevent even a device manufacturer from decrypting data.

Tech experts argue that any guaranteed access for law enforcement weakens overall Internet security and makes online transactions such as banking and hotel bookings riskier. Privacy advocates say strong encryption provides important protection to individuals.

But law enforcement officials, along with some lawmakers, continue to argue that impenetrable encryption is a danger to public safety.

“From gang activity to child abductions to national security threats, the ability to access electronic evidence in a timely manner is often essential to successfully conducting lawful investigations and preventing harm to potential victims,” Assistant Attorney General Leslie Caldwell said at the annual State of the Net conference on Monday.

The White House has tried to engage Silicon Valley on the topic, recently meeting with top tech executives on the West Coast. But some lawmakers feel the process should move quicker.

In the upper chamber, Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.) are working on a bill that would force companies to build their encryption so they could respond to a court order for secured data.

Both members of the Intelligence Committee have expressed a desire to move swiftly on encryption legislation and bypass the proposed national commission to study the topic.

McCaul warned that the threats the U.S. faces online “will only grow more prevalent.”

“The security of Americans’ personal information needs to keep pace with the emerging technologies of today,” McCaul said.

Half-Measures on Encryption Since Snowden

When the NSA subcontractor Edward Snowden released classified documents in June 2013 baring the U.S. intelligence community’s global surveillance programs, it revealed the lax attention to privacy and data security at major Internet companies like Apple, Google, Yahoo, and Microsoft. Warrantless surveillance was possible because data was unencrypted as it flowed between internal company data centers and service providers.

The revelations damaged technology companies’ relationships with businesses and consumers. Various estimates pegged the impact at between $35 billion and $180 billion as foreign business customers canceled service contracts with U.S. cloud computing companies in favor of foreign competitors, and as the companies poured money into PR campaigns to reassure their remaining customers.

There was a silver lining: the revelations catalyzed a movement among technology companies to use encryption to protect users’ data from spying and theft. But the results have been mixed. Major service providers including Google, Yahoo, and Microsoft—who are among the largest providers of cloud- and Web-based services like e-mail, search, storage, and messaging—have indeed encrypted user data flowing across their internal infrastructure. But the same isn’t true in other contexts, such as when data is stored on smartphones or moving across networks in hugely popular messaging apps like Skype and Google Hangouts. Apple is leading the pack: it encrypts data by default on iPhones and other devices running newer versions of its operating system, and it encrypts communications data so that only the sender and receiver have access to it.

But Apple products aren’t widely used in the poor world. Of the 3.4 billion smartphones in use worldwide, more than 80 percent run Google’s Android operating system. Many are low-end phones with less built-in protection than iPhones. This has produced a “digital security divide,” says Chris Soghoian, principal technologist at the American Civil Liberties Union. “The phone used by the rich is encrypted by default and cannot be surveilled, and the phone used by most people in the global south and the poor and disadvantaged in America can be surveilled,” he said at MIT Technology Review’s EmTech conference in November.

Pronouncements on new encryption plans quickly followed the Snowden revelations. In November 2013, Yahoo announced that it intended to encrypt data flowing between its data centers and said it would also encrypt traffic moving between a user’s device and its servers (as signaled by the address prefix HTTPS). Microsoft announced in November and December 2013 that it would expand encryption to many of its major products and services, meaning data would be encrypted in transit and on Microsoft’s servers. Google announced in March 2014 that connections to Gmail would use HTTPS and that it would encrypt e-mails sent to other providers that also support encryption, such as Yahoo. And finally, in 2013 and 2014, Apple implemented the most dramatic changes of all, announcing that the latest version of iOS, the operating system that runs on all iPhones and iPads, would include built-in end-to-end encrypted text and video messaging. Importantly, Apple also announced it would store the keys to decrypt this information only on users’ phones, not on Apple’s servers—making it far more difficult for a hacker, an insider at Apple, or even government officials with a court order to gain access.

Google, Microsoft, and Yahoo don’t provide such end-to-end encryption of communications data. But users can turn to a rising crop of free third-party apps, like ChatSecure and Signal, that support such encryption and open their source code for review. Relatively few users take the extra step to learn about and use these tools. Still, secure messaging apps may play a key role in making it easier to implement wider encryption across the Internet, says Stephen Farrell, a computer scientist at Trinity College Dublin and a leader of security efforts at the Internet Engineering Task Force, which develops fundamental Internet protocols. “Large messaging providers need to get experience with deployment of end-to-end secure messaging and then return to the standards process with that experience,” he says. “That is what will be needed to really address the Internet-scale messaging security problem.”

Cisco Security Report: Dwell time and encryption security struggles

The 2016 Cisco Security Report highlighted the duality of cybersecurity and described a number of issues, including encryption security and dwell time as a constant struggle between threat actors looking for more effective and efficient attack techniques and security providers responding to those changes.

One of the statistics in the report that could have been spun as a net positive for Cisco was that since May, Cisco reduced the median time to detection (or dwell time) of known threats on its networks to 17 hours. However, Jason Brvenik, principal engineer for the Security Business Group at Cisco, noted that this metric was more representative of the “push and pull” between threat actors and security and should be used more as a way to see which side is improving at a given time.

“Our point in talking about time to detection is that it’s a durable metric that organizations can use, establish and measure to help them understand how well they’re doing and what their opportunity is to improve,” Brvenik said. “And, if you don’t start paying attention to time to detection, then the attacker basically has unfettered access until you get that.”

Fred Kost, senior vice president at HyTrust, said that although dwell time is an important security metric, it is reactive and not preventative.

“Time to detection will vary over time as the cat and mouse game plays out between attackers and defenders,” Kost said. “Part of the challenge for enterprises is the improving ability of attackers to remain covert once they have access to the network and servers, driving the need to have better segmentation and controls on what privileges users have, especially as virtualization and cloud makes access to a greater number of systems more likely.”

Brvenik said Cisco has had some success in bringing down dwell time, but that this measurement would ultimately vary.

“I fully expect that [threat actors] are going to recognize the lack of ROI or the reduction of ROI they’re getting and they’re going to come back and try something new,” Brvenik said. “As a defender, you have to be right 100% of the time and the attacker only has to be right once.”

Cisco found this was already true in the evolution of botnets and exploit kits (EK) like Angler. The Security Report showed that attackers using Angler had large scale campaigns with 90,000 targets per server per day, 10% of which were served exploits. Of those served exploits, 40% were compromised and 62% of those were served ransomware. Though only a small fraction paid the ransom (2.9%) and each instance was a few hundred dollars, that still added up to $34 million per ransomware campaign over the course of a year.
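
Working that funnel through per server, and assuming an average ransom of $300 (the report says only “a few hundred dollars”), yields roughly $7 million a year per server, which suggests the $34 million per-campaign figure aggregates several such servers:

```python
# Per-server funnel implied by the figures quoted from the report.
targets_per_day = 90_000
served_exploits = targets_per_day * 0.10   # users actually served an exploit
compromised     = served_exploits * 0.40   # successful compromises
ransomware      = compromised * 0.62       # infections that delivered ransomware
paying          = ransomware * 0.029       # victims who pay, per server per day

avg_ransom = 300  # assumption: "a few hundred dollars"
annual_per_server = paying * avg_ransom * 365

print(f"{paying:.1f} paying victims/day, "
      f"${annual_per_server / 1e6:.1f}M/year per server")
```

Under these assumptions about 65 victims pay per server each day, so the stated campaign total implies on the order of five servers running concurrently over the year.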

Craig Williams, senior technical leader and outreach manager at Cisco, said the advancements seen in how attackers use the Angler EK and botnets can be directly attributed to the security industry getting better at its job. Williams described how five or ten years ago, botnets were simple setups of one server connecting to another, so it was easy to block the host server to take down the botnet. But, attackers have found a way to use Angler to make this much more difficult to stop.

“The way that they set up the network to host these exploits is really intelligently architected around the fact that they want to have the ability to rotate servers as we take them down. You can kind of think of it like a Hydra,” Williams said. “When the customer gets redirected to the Angler exploit kit’s landing page, to them it looks like the front-end proxy server is all there is. But, the reality is that behind the scenes it’s actually being connected to another server hosting the exploit and yet a third server that’s actually continuously pinging it to make sure it’s online. The second it goes down from an abuse ticket or blocked by a good guy, it’ll actually rotate that server out and replace it with another server with a completely different IP address. So, effectively cutting the head off the Hydra, another head pops up in place and takes over. It’s a really unique design and I think it’s one that we’ve seen and will continue to see people evolve to just because it’s a little more efficient way to be a bad guy and that’s just the nature of the game in this day and age.”
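
Stripped of the malicious specifics, the rotation Williams describes is an ordinary health-check failover pattern: a front end forwards traffic to a hidden backend, and a monitor swaps in a spare on a fresh IP address the moment the active one is taken down. A hypothetical sketch (the class, addresses, and checks are all invented for illustration):

```python
class BackendPool:
    """Failover pattern: one active backend, spares waiting on other IPs."""

    def __init__(self, active: str, spares: list[str]):
        self.active = active
        self.spares = list(spares)

    def health_check(self, is_up) -> str:
        # is_up: callable reporting whether a backend still responds.
        # If the active backend has been taken down, rotate in a spare:
        # cut off one head of the Hydra and another pops up.
        if not is_up(self.active) and self.spares:
            self.active = self.spares.pop(0)
        return self.active

pool = BackendPool("10.0.0.1", ["10.0.0.2", "10.0.0.3"])
taken_down = {"10.0.0.1"}                     # e.g. an abuse-ticket takedown
pool.health_check(lambda ip: ip not in taken_down)
assert pool.active == "10.0.0.2"              # traffic silently moves to a new IP
```

The same structure underlies legitimate load-balancer failover; what made Angler’s use notable was applying it so that blocking any single exploit-hosting IP accomplished nothing.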

Another subject that Cisco found to have both positive and negative consequences was encryption. The report stated that encryption can create security issues for organizations, including a false sense of security. Research found that encrypted traffic, especially HTTPS, crossed a tipping point in 2015: more than 50% of bytes transferred over the year were encrypted. Brvenik said this is something organizations need to plan for, because it means “they’re rapidly losing visibility into some of the threats that can present there.”

Williams noted that while the push towards encryption is good from a privacy standpoint, it will also introduce “significant security issues.” Williams said the biggest misconceptions were that people tend to think if something is encrypted, it is safe, and that more encryption is always better.

“Think about what encryption was designed to be used for — only the sensitive pieces of data. That’s how encryption is intended to be used,” Williams said. “An advertisement from a website is not a sensitive piece of data and it shouldn’t be encrypted. If it is, then you’re effectively hiding any potential attacks from detection systems. So, even if your company has IPS or in-line antivirus, you’re not going to see potential attacks.”

Brvenik said the loss of visibility will have cascading impacts and organizations need to plan security strategies now.

“The impact of a lack of visibility in one layer will affect others. There are solutions that can move to the endpoint; there are solutions that can move to decapsulation; there are a lot of approaches there,” Brvenik said. “The point is — they need to start thinking about it now because they’re going to find themselves in a situation where it’s too late.”

Gur Shatz, co-founder and chief technology officer of Cato Networks, said enterprises need to be careful about how they plan security strategies because dealing with encrypted data can be resource-intensive.

“Encrypted traffic requires decryption before it can be analyzed. This is a CPU-intensive process, and could add latency,” Shatz said. “Ideally, you want to decrypt once, and do all the threat detection (multiple layers) on the decrypted traffic. When using point solutions, each one will need to decrypt the traffic separately, potentially slowing down traffic. On the flip side, some enterprises will want to choose and integrate best-of-breed point solutions, because they believe they can get better detection.”
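The "decrypt once, inspect many times" design Shatz describes can be sketched in a few lines. The detector functions and the XOR "cipher" are toy stand-ins for real TLS termination and IPS/AV layers:

```python
# Sketch of decrypt-once inspection: terminate TLS a single time,
# then fan the plaintext out to every detection layer.

def decrypt(ciphertext: bytes) -> bytes:
    # Stand-in for TLS termination; the one CPU-intensive pass.
    return bytes(b ^ 0x42 for b in ciphertext)   # toy XOR cipher

def ips_check(plaintext: bytes) -> bool: return b"exploit" not in plaintext
def av_check(plaintext: bytes)  -> bool: return b"malware" not in plaintext

DETECTORS = [ips_check, av_check]

def inspect(ciphertext: bytes) -> bool:
    plaintext = decrypt(ciphertext)          # decrypted exactly once
    return all(check(plaintext) for check in DETECTORS)

# With chained point solutions, each appliance would instead repeat
# decrypt() on the same traffic, multiplying CPU cost and latency.
```

This is the trade-off Shatz names: a single decryption point is cheaper, while best-of-breed point solutions each pay the decryption cost separately.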

Jeff Schilling, CSO for Armor, said the latency issue could force difficult decisions.

“More complex encryption algorithms are harder to decrypt for Layer 7 inspection, looking for common web OWASP top ten application attacks. This is driving the web industry to look to CDN application inspection architectures, which can inject latency, which many of our customers can’t tolerate,” Schilling said. “We have to ask ourselves, which problem has more risk? Threat actors decrypting data or launching application layer attacks? I think there is more risk in the latter.”

One attack vector that Cisco said was being overlooked was in malicious browser extensions. Cisco’s research found that more than 85% of organizations encounter malicious extensions in the browser, which can lead to leaked data, stolen account credentials, and even attackers installing malicious software on a victim’s computer.

Williams said this is especially dangerous because the browser is the largest attack surface in an organization. But, Williams also said that this should be a very easy problem to fix, because although internal Web apps may need a specific plugin or browser version, the tools exist to secure the enterprise environment.

“The reality is in this day and age there are so many different types of browsers out there and so many different ways to install those, that you can easily have a secured browser for the Internet and another browser you use because you have to have a specific plugin or a specific variant,” Williams said. “You can determine this from the network. There is no reason companies should allow insecure browsers to access the Internet anymore. We have the technology. We have solutions that can filter out vulnerable browsers and just prevent them from connecting out.”
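The network-side filtering Williams describes can be approximated by parsing the browser version out of the User-Agent header and denying outdated clients. A minimal sketch, where the version floors and the deny-by-default policy are illustrative choices, not recommendations:

```python
import re

# Hypothetical minimum acceptable major versions for internet access.
MIN_VERSIONS = {"Chrome": 48, "Firefox": 44}

def allow_outbound(user_agent: str) -> bool:
    """Permit outbound internet access only for known browsers at or
    above the minimum version; unknown agents are denied by default."""
    for browser, minimum in MIN_VERSIONS.items():
        match = re.search(rf"{browser}/(\d+)", user_agent)
        if match:
            return int(match.group(1)) >= minimum
    return False

allow_outbound("Mozilla/5.0 ... Chrome/50.0.2661.94 ...")  # permitted
allow_outbound("Mozilla/5.0 ... Chrome/31.0.1650.57 ...")  # blocked
```

A legacy browser pinned to an internal web app would simply never match the outbound policy, which is the split-browser setup Williams suggests.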

Robert Hansen, vice president of WhiteHat Labs at WhiteHat Security, said enterprises should have strong policies about what browser extensions can be installed by employees.

“Browser extensions often leak data about their presence, people’s web-surfing habits, and other system level information. Sometimes this can be fairly innocuous (for instance anonymized metadata about usage) and sometimes it can be incredibly dangerous, like full URL paths of internal sensitive devices,” Hansen said. “In general, people really shouldn’t be installing their own browser extensions – that should be for IT to vet and do for them to ensure they aren’t inadvertently installing something malicious.”

British voice encryption protocol has massive weakness, researcher says

A protocol designed and promoted by the British government for encrypting voice calls has a by-design weakness built into it that could allow for mass surveillance, according to a University College London researcher.

Steven Murdoch, who works in the university’s Information Security Research Group, analyzed a protocol developed by CESG, which is part of the spy agency GCHQ.

The MIKEY-SAKKE (Multimedia Internet KEYing-Sakai-Kasahara Key Encryption) protocol calls for a master decryption key to be held by a service provider, he wrote in an analysis published Tuesday.

Cryptography engineers seeking to build secure systems avoid this approach, known as key escrow, as it makes whatever entity holding the key a target for attack. It also makes the data of users more vulnerable to legal action, such as secret court orders.
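The structural problem with key escrow can be shown with a toy model. This is emphatically not the pairing-based SAKKE construction itself, just a minimal illustration (the identities, the HMAC derivation, and the XOR "cipher" are all invented) of why a provider-held master secret means the provider can derive any user's key:

```python
import hashlib
import hmac

# Toy key-escrow model: every user key is derivable from one master
# secret held by the service provider.
MASTER_KEY = b"held-by-the-service-provider"

def user_key(identity: str) -> bytes:
    # The provider derives each user's key from the master secret.
    return hmac.new(MASTER_KEY, identity.encode(), hashlib.sha256).digest()

def toy_encrypt(key: bytes, message: bytes) -> bytes:
    # Toy XOR stream cipher; real systems use authenticated encryption.
    stream = hashlib.sha256(key).digest() * (len(message) // 32 + 1)
    return bytes(m ^ s for m, s in zip(message, stream))

# Alice encrypts under her own key...
ct = toy_encrypt(user_key("alice"), b"secret call contents")
# ...but anyone holding MASTER_KEY re-derives her key and decrypts:
pt = toy_encrypt(user_key("alice"), ct)  # XOR is its own inverse
```

The attack surface Murdoch points to follows directly: compromise of the one master secret, by hackers or by court order, exposes every user at once.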

The approach taken by the British government is not surprising given that it has frequently expressed its concerns over how encryption could inhibit law enforcement and impact terrorism-related investigations.

The technology industry and governments have been embroiled in a fierce ongoing debate over encryption, with tech giants saying building intentionally weak cryptography systems could provide attack vectors for nation-state adversaries and hackers.

Murdoch wrote CESG is well aware of the implications of its design. Interestingly, the phrase “key escrow” is never used in the protocol’s specification.

“This is presented as a feature rather than bug, with the motivating case in the GCHQ documentation being to allow companies to listen to their employees calls when investigating misconduct, such as in the financial industry,” he wrote.

The endorsement of the protocol has wide-ranging implications for technology vendors. Murdoch wrote that the British government will only certify voice encryption products that use it. The government’s recommendations also influence purchasing decisions throughout the industry.

“As a result, MIKEY-SAKKE has a monopoly over the vast majority of classified U.K. government voice communication, and so companies developing secure voice communication systems must implement it in order to gain access to this market,” he wrote.

CESG has already begun certifying products under its Commercial Product Assurance (CPA) security evaluation program. Approved products must use MIKEY-SAKKE and also Secure Chorus, an open-source code library that ensures interoperability between different devices.

There is no ‘compromise’ in encryption debate between Silicon Valley and government leaders

During last night’s Democratic debate, we were once again inundated with calls from politicians seeking compromise from Silicon Valley in the government’s ongoing battle with terrorism. Encryption was the point of contention.

The candidates echoed previous statements regarding the dangerous world we live in. The reason for danger, or so it goes, is the inability of law enforcement to pursue threats from terrorists, both domestic and international, who are increasingly reliant on encryption to communicate.

The sentiment is genuine, albeit misguided; more on that in a moment.

Currently, the argument is painted as black and white, a “you’re either for us, or against us” exchange that leaves average Americans scratching their collective heads wondering why Silicon Valley isn’t stepping up the fight against terrorism by cooperating with government.

Arguments, even this one, are rarely binary.

In fact, from a security standpoint, the compromise the government seeks is impossible.

“Technically, there is no such backdoor that only the government can access,” says cyber security expert Swati Khandelwal of The Hacker News. “If surveillance tools can exploit ‘vulnerability by design,’ then an attacker who gained access to it would enjoy the same privilege.”

Troy Hunt, a Microsoft MVP for developer security, has made the same point.

The encryption smear campaign

For all that encryption does for us, it has become a quagmire of political talking points and general misunderstanding by citizens and, I’d argue, politicians.

“The truth is that encryption is a tool that is used for good, by all of us who use the internet everyday,” says famed computer security expert Graham Cluley.

“Encryption is a tool for freedom. Freedom to express yourself. Freedom to be private. Freedom to keep your personal data out of the hands of hackers.”

The term itself has become a bit of a paradox. Numbers paint a picture of citizens who think it’s important but have no real idea of how and where it protects them.

According to a Pew Research report, fewer than 40 percent of US citizens feel their data is safe online, yet only 10 percent of adults say they’ve used encrypted phone calls, text messages, or email, and 9 percent have tried to cover their online footprints with a proxy, VPN, or Tor.


These numbers demonstrate a fundamental misunderstanding of encryption and further detail its public perception.

It is, after all, only natural to attempt to protect yourself when you can foresee a threat, yet US citizens have a rather apathetic view of the very technology that could make them safer online.

The experts in the field I spoke with all seem to agree that there are two reasons people aren’t taking more steps to remain secure.

  1. Barrier to entry: These technologies feature a lot of jargon, and many aren’t all that user friendly. PGP, for example, the email encryption technology Edward Snowden used to communicate with Laura Poitras and Glenn Greenwald, involves setup instructions that are relatively foreign to the average citizen.
  2. Negative connotation: Most Americans don’t realize they use encryption every day of their lives. Instead, they know encryption as the tool terrorists use to send private messages, recruit new members and spread propaganda online. This is largely due to the on-going encryption debate.

This debate, whether planned or incidental, is doubling as a smear campaign for the very suite of tools that keeps our online lives secure.

It wasn’t mentioned amongst our expert panel, but I believe there is a third reason that the general public isn’t taking action to better secure themselves online.


Silicon Valley isn’t backing down

So far, the term “debate” may be more of a misnomer. Silicon Valley isn’t debating anything. Furthermore, there may not be anything to debate in the first place.

You can’t “compromise” on weakened security; either it’s secure, or it isn’t.

This veritable Pandora’s Box the US Government wants to explore would lead to backdoor access into our personal lives not just for the government but for hackers and bad actors around the globe. And once you open it, there’s no going back.

“You can’t have secure encryption with a government backdoor. Those two things are mutually exclusive,” notes Jon Hendren, cybersecurity thought leader at ScriptRock. “‘Working with Silicon Valley’ is essentially code for ‘giving us a backdoor into the accounts, data, and personal lives of users’ and should be totally unacceptable to anyone who even remotely values their privacy.”

There’s also the issue of trust. Even if these backdoors weren’t creating vulnerabilities for bad actors to attack, do we trust the government with our data in the first place?

Our expert panel says no.

“Handing over such backdoor access to the government would also require an extraordinary degree of trust,” says Khandelwal, “but data breaches like OPM [Office of Personnel Management] proved that government agencies cannot be trusted to keep these backdoor keys safe from hackers.”


Compromise, in this case, is a rather contentious point of view. The compromise the government seeks isn’t a compromise at all; it’s a major loss of privacy and security as it relates to citizens around the globe.

Cluley has shown us how this “compromise” might play out.


The alternative is to keep things as they are, or to go further in the effort to secure the internet.

There really is no middle ground in this debate.

Would providing a backdoor help to fight terrorism?

Since fighting terrorism is the narrative the government is using in its attempt to stamp out encryption, you have to wonder whether encryption is truly the thorn in its side that officials claim it to be.

It’s well-known that ISIS is using encrypted chat apps, like Telegram, to plan attacks and communicate without detection, but would a backdoor have any effect on the surveillance or capture of extremists?

“A backdoor would also have a limited window of efficacy; bad guys would avoid it once that cat is out of the bag,” says Hendren. “This would have the effect of pushing bad actors further down into more subtle and unconventional ways of communication that counter-terrorists might not be aware of, or watching out for, lessening our visibility overall.”

Khandelwal agrees.

It’s naive to believe that an extremist group that recruits and grooms new members from the keyboard, not the battlefield, isn’t tech savvy enough to find a new means of communication as current ones become compromised.

The privacy debate isn’t going anywhere. Moreover, the ambient noise created by politicians spouting off about technologies they don’t understand should grow in volume as we near the primaries and then the general election.

What’s clear, though, is that this isn’t a debate and that Silicon Valley has no means to compromise.

Without compromise, the government is left with but one recourse: policy. One can only hope that we have leaders and citizens who better understand the need for encryption before that day comes.

In this debate, no one is compromising — or compromised — but the end user.