The encryption challenge

IT managers know the movies get it wrong. A teenager with a laptop cannot crack multiple layers of encryption — unless that laptop is connected to a supercomputer somewhere and the teenager can afford to wait a few billion years.

Encryption works. It works so well that even the government gets stymied, as demonstrated by the lengths to which the FBI went to access an iPhone used by one of the San Bernardino, Calif., shooters.

So in the face of ever more damaging stories about data breaches, why aren’t all government agencies encrypting everything, everywhere, all the time?

Encryption can be costly and time consuming. It can also be sabotaged by users and difficult to integrate with legacy applications.

Furthermore, according to a recent 451 Research survey of senior security executives, government agencies seem to be fighting the previous war. Instead of protecting data from hackers who’ve already gotten in, they’re still focusing on keeping the bad guys out of their systems.

Among U.S. government respondents, the top category for increased spending in the next 12 months was network defenses — at 53 percent. By comparison, spending for data-at-rest defenses such as encryption ranked dead last, with just 37 percent planning to increase their spending.

Part of the reason for those figures is that government agencies overestimate the benefits of perimeter defenses. Sixty percent said network defenses were “very” effective, a higher share than for any other defense category, while government respondents rated data-at-rest defenses as less effective than respondents from any other sector did.

There was a time when that attitude made sense. “Organizations used to say that they wouldn’t encrypt data in their data centers because they’re behind solid walls and require a [password] to get in,” said Steve Pate, chief architect at security firm HyTrust.

That attitude, however, runs counter to the modern reality that there is no longer a perimeter to protect. Every organization uses third-party service providers, offers mobile access or connects to the web — or a combination of all three.

A security audit at the Office of Personnel Management, for example, showed that use of multifactor authentication, such as the government’s own personal identity verification card readers, was not required for remote access to OPM applications. That made it easy for an attacker with a stolen login and password to bypass all perimeter defenses and directly log into the OPM systems.

An over-reliance on perimeter defenses also means that government agencies pay less attention to where their important data is stored than they should.

According to the 451 Research survey, government respondents were among those with the lowest confidence in the security of their sensitive data’s location. Although 50 percent of financial-sector respondents expressed confidence, only 37 percent of government respondents could say the same.

In fact, only 16 percent of all respondents cited “lack of perceived need” as a barrier to adopting data security, but 31 percent of government respondents, almost twice as many, did so.

Earlier this year, the Ponemon Institute released a report showing that 33 percent of government agencies use encryption extensively, compared to 41 percent of companies in general and far behind the financial sector at 56 percent. In that survey of more than 5,000 technology experts, 16 percent of agency respondents said they had no encryption strategy.

On a positive note, the public sector has been making headway. Last year, for example, only 25 percent of government respondents to the Ponemon survey said they were using encryption extensively.

“This is showing heightened interest in data protection,” said Peter Galvin, vice president of strategy at Thales e-Security, which sponsored the Ponemon report. High-profile data breaches have drawn public attention to the issue, he added.

On encryption: There are no locks that only “angels” can open

Despite the FBI dropping its case against Apple over whether the tech giant should supply the government agency with the ability to hack into the San Bernardino shooter’s iPhone, the argument over how our devices — especially our phones — should be encrypted continues to rage. And regardless of how you feel about the issue, almost everybody agrees that the debate can be pretty murky, as privacy vs. protection debates usually are. To make the whole argument easier to digest, however, one of the web’s best educators and entertainers, CGP Grey, has broken it all down in one clear five-minute video.

The video, posted above, parallels physical locks with digital “locks” (encryption), noting how they relate and how they differ in order to help us better understand the encryption debate. And one of the most important points that Grey makes about digital locks is that they need to work not only against local threats, but threats from across the globe — threats coming from “internet burglars” and their “burglar bots.”

Grey touches on the scenario in which a bad guy with an armed bomb dies, leaving behind only an encrypted phone with the code to stop the bomb. In this particular case — a parallel to the San Bernardino shooter case, as there may have been information regarding further threats on his encrypted phone — Grey points out that this may be a time when we’d want the police to have access, or a “backdoor,” to the phone. But if companies were forced to build backdoors into their products so government agencies could use them for situations like these, could we trust authorities not to abuse their powers? Could we trust that “demons” (people with bad intentions) wouldn’t hijack the backdoors?

Grey argues that we couldn’t, saying that “there’s no way to build a digital lock that only angels can open and demons cannot.”

There’s also a bonus “footnote” video (below) in which Grey discusses just how intimate the data on our phones has become (do you remember where you were on April 8th at 6:02AM? No? Your phone does).

What do you think about CGP Grey’s breakdown of the encryption debate?

Moot point: Judge closes iPhone encryption case in Brooklyn

The United States Justice Department said on Friday that it had withdrawn its request to compel Apple Inc to cooperate in unlocking an iPhone tied to a drug case in New York, after a third party provided authorities with the passcode to access the handset.

“An individual provided the department with the passcode to the locked phone at issue in the Eastern District of New York”, Justice Department spokesman Marc Raimondi said in a statement.

On Friday, the Justice Department told a federal court in Brooklyn that it would withdraw the motion to force Apple to pull data from a drug dealer’s locked iPhone, The Washington Post reported.

Investigators have dropped the court case against Apple as they have successfully gained access to the iPhone 5s involved in the NY drug case.

There are about a dozen other All Writs Act orders for Apple’s assistance with opening other devices that are unresolved but not in active litigation, according to a Justice Department official. Apple, meanwhile, demanded to know in the New York case whether the government had exhausted all other options to get to the data.

The company said it “strongly supports, and will continue to support, the efforts of law enforcement in pursuing criminals”, but not through the government’s misuse of a law it wants to use as a “precedent to lodge future, more onerous requests for Apple’s assistance”.

The case dates back to 2014, when authorities seized the iPhone 5s of the suspect Jun Feng. Feng pleaded guilty in October to conspiring to distribute methamphetamine and is scheduled to be sentenced in June. Comments attributed to Apple’s attorneys also suggest that while the company isn’t aware of the method used, it’s convinced that normal product development is eventually going to plug whatever exploit was used to gain access to that iPhone.

According to the Wall Street Journal, that “individual” is Feng himself, who has already been convicted and only recently became aware that his phone was the subject of a national controversy.

The California case began on February 16 with an order from Judge Sheri Pym and ended on March 28, when the Justice Department withdrew its legal action against Apple.

FBI Director James Comey’s remarks strongly implied that the bureau paid at least $1.3 million to get onto the phone, which had belonged to Syed Rizwan Farook, who, with his wife, killed 14 people during the December 2 terror attack in San Bernardino, Calif.

FBI: Encryption increasing problem

The FBI is facing an increasing struggle to access readable information and evidence from digital devices because of default encryption, a senior FBI official told members of Congress at a hearing on digital encryption Tuesday.

Amy Hess said officials encountered passwords in 30 percent of the phones the FBI seized during investigations in the last six months, and investigators have had “no capability” to access information in about 13 percent of the cases.

“We have seen those numbers continue to increase, and clearly that presents us with a challenge,” said Hess, the executive assistant director of the FBI branch that oversees the development of surveillance technologies.

In her testimony to a subcommittee of the House Energy and Commerce Committee, Hess defended the Justice Department’s use of a still-unidentified third party to break into the locked iPhone used by one of the two San Bernardino, California, attackers. But she said the reliance on an outside entity represented just “one potential solution” and that there’s no “one-size-fits-all” approach for recovering evidence.

Representatives from local law enforcement agencies echoed Hess’s concerns. Thomas Galati, chief of the intelligence bureau at the New York Police Department, said officials there have been unable to break open 67 Apple devices for use in 44 different investigations of violent crime — including 10 homicide cases.

Still, despite anxieties over “going dark,” a February report from the Berkman Center for Internet and Society at Harvard University said the situation was not as dire as law enforcement had described and that investigators were not “headed to a future in which our ability to effectively surveil criminals and bad actors is impossible.”

The hearing comes amid an ongoing dispute between law enforcement and Silicon Valley about how to balance consumer privacy against the need for police and federal agents to recover communications and eavesdrop on suspected terrorists and criminals. The Senate is considering a bill that would effectively prohibit unbreakable encryption and require companies to help the government access data on a computer or mobile device when a warrant is issued. Bruce Sewell, Apple’s general counsel, touted the importance of encryption.

“The best way we, and the technology industry, know how to protect your information is through the use of strong encryption. Strong encryption is a good thing, a necessary thing. And the government agrees,” Sewell testified.

How Apple makes encryption easy and invisible

Do you know how many times a day you unlock your iPhone? Every time you do, you’re participating in Apple’s user-friendly encryption scheme.

Friday, the company hosted a security “deep dive” at which it shared some interesting numbers about its security measures and philosophy as well as user habits. To be honest, we’re less concerned with how Apple’s standards work than the fact that they do and will continue to. But that’s kind of the point behind the whole system — Apple designed its encryption system so that we don’t even have to think about it.

Apple’s encryption and security protocols have faced a ton of scrutiny during its recent showdown with the government. And if anything, that debate has gotten more people thinking seriously about how data can and should be secured. And the topic is not going away for a while.

We weren’t there Friday, but Ben Bajarin from Techpinions offers some great analysis, and his piece includes some really cool stats. For one, Apple says that the average user unlocks their phone 80 times a day. We don’t know if that’s across all platforms or just iOS. It sounds a little low in my case, however, because I’m generally pretty fidgety.

But because people are checking their phones so often, it’s important for Apple developers to make encryption powerful without causing the end user frustration. Like if they could just plunk their thumb down, and their phone would unlock, for example.

Apple says 89 percent of people who own Touch ID-enabled devices use the feature. And that’s a really impressive adoption rate, but it makes sense when you think about how much easier the biometric system is to use than a passcode.

Passcodes are great, of course, and you have to have one. But as an experiment a while ago, I turned off Touch ID and went numbers-only to unlock my phone. And guess what? It was really annoying. I switched the feature back on by the end of the day.

Apple also talked up its so-called Secure Enclave, the slightly intimidating name for the dedicated co-processor that has handled all encryption for its devices since the iPhone 5s. Each Enclave has its own unique ID (UID) that it uses to scramble all of the other data for safekeeping. And neither Apple nor other parts of your phone know what that UID is; it all just happens on its own. And that’s pretty much how we prefer it.
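
To make the general idea concrete, here is a minimal Python sketch of the pattern the article describes: a device-unique secret is never exposed directly, but is run through a key-derivation function to produce the key that actually encrypts data. This is only an illustration of the concept, not Apple’s actual Secure Enclave design; the random UID, the HKDF parameters and the choice of AES-GCM are assumptions made for the example.

```python
# Conceptual sketch only: derive an encryption key from a device-unique ID
# (UID) and use it to encrypt data. This is NOT Apple's Secure Enclave
# implementation; the UID, KDF parameters and cipher choice are assumptions.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

device_uid = os.urandom(32)          # stand-in for a hardware-fused unique ID

def derive_key(uid: bytes, purpose: bytes) -> bytes:
    """Derive a 256-bit key from the device UID for a specific purpose."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=purpose,                # separates keys derived for different uses
    ).derive(uid)

key = derive_key(device_uid, b"file-encryption-example")
nonce = os.urandom(12)               # AES-GCM nonce, must be unique per message
ciphertext = AESGCM(key).encrypt(nonce, b"some user data", None)

# Decryption only works if the same key can be re-derived from the same UID.
assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"some user data"
```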

Apple, FBI set to resume encryption fight at House hearing

The encryption battle between Apple and the FBI is moving from the courtroom to Congress next week.

Representatives from the tech titan and the federal law enforcement agency are scheduled to testify Tuesday before the House Energy and Commerce Committee about the debate over how the use of encryption in tech products and services hampers law enforcement activities.

In February, Apple clashed with the FBI over whether the company would help investigators hack into the encrypted iPhone of San Bernardino shooter Syed Farook. That case ended when the FBI said it had found a way to unlock the phone without Apple’s help. The debate, however, is unresolved.

Technology companies and rights groups argue that strong encryption, which scrambles data so it can be read only by the right person, is needed to keep people safe and protect privacy. Law enforcement argues it can’t fight crimes unless it has access to information on mobile devices.

The hearing, called “Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives,” will include two panels. The first features Amy Hess, executive assistant director for science and technology at the FBI, who will speak about law enforcement concerns along with other law enforcement officials from around the country. Apple general counsel Bruce Sewell will speak during a second panel, which will feature computer science and security professionals.

The FBI and Apple did not immediately respond to requests for comment on their testimony.

The hearing’s agenda comes just a day after a US Senate encryption bill was released that would give law enforcement and government investigators access to encrypted devices and communications. Authored by US Sens. Dianne Feinstein and Richard Burr, the bill furthers a fight that pits national security against cybersecurity.

Earlier this month, Facebook complicated things a bit further for the FBI when it announced that all communications sent on its popular WhatsApp messaging app are now encrypted.

Feinstein encryption bill sets off alarm bells

A draft version of a long-awaited encryption bill from Sens. Dianne Feinstein, D-Calif., and Richard Burr, R-N.C., was leaked online last week, and the technology industry is already calling foul.

The bill requires any company that receives a court order for information or data to “provide such information or data to such government in an intelligible format” or to “provide such technical assistance as is necessary to obtain such information or data in an intelligible format.” It doesn’t specify the terms under which a company would be forced to help, or what the parameters of “intelligible” are.

The lack of these boundaries is one of the reasons why the backlash to the bill — which isn’t even finished — has been so fast and overwhelming. Kevin Bankston, director of the Open Technology Institute, called it “easily the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen.”

It’s disheartening that the senators intend to continue pressing on with this bill, especially in light of the FBI’s recent bullying of Apple. After the FBI bungled its handling of the San Bernardino shooter’s phone, it tried and failed to force Apple into creating a new program that would let it hack into not just the shooter’s phone but probably many other phones as well. When Apple resisted, the FBI mysteriously came up with a workaround. Small wonder other technology companies are reacting poorly to this Senate bill.

Feinstein’s staffers said that the issue is larger than one phone. That’s true — and it’s exactly why such a broad proposal should make everyone who uses a smartphone uneasy. Giving law enforcement such a broad mandate would inevitably lead to questionable decisions, and it would weaken Internet security for everyone.

Feinstein’s staff also said that the reason for the bill’s vagueness is that the goal is simply to clarify law, not to set a strict method for companies or to tell the court what the penalties should be should companies choose not to follow orders. That sounds good in theory. In practice, Feinstein and Burr would be well-advised to go back to the table with technology interests — and really listen to their concerns.

“Petya” ransomware encryption cracked

"Petya" ransomware encryption cracked

Utility generates unscrambling key.

Users whose data has been held to ransom by the Petya malware now have an option to decrypt the information, thanks to a new tool that generates an unscrambling key.

Petya appeared around March this year. Once executed with Windows administrator privileges, Petya rewrites the master boot record on the computer’s hard drive, crashes the operating system and, on restart, scrambles the data on the disk while masquerading as the CHKDSK file consistency utility.

The Petya attackers then demand approximately A$555 in ransom, payable in Bitcoin, to provide a decryption key for the locked system.

An anonymous security researcher using the Twitter handle leo_and_stone has now cracked the encryption Petya uses, the Salsa10 function created by Daniel J. Bernstein in 2004.

Decrypting hard disks scrambled with Petya using the tool is a relatively complex operation. The tool requires data from an eight-byte nonce (a random, use-once number) file and a 512-byte sector from the hard disk to be input into a website to generate the decryption key.

This means the Petya-infected hard drive has to be removed from the victim computer, and the small amount of data needed for the decryptor read and copied with low-level system utilities.
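
As a rough illustration of that extraction step, the sketch below reads the two pieces of data from a raw image of the infected disk using plain Python. The file name, sector number and byte offset are placeholders, not verified constants; the exact locations depend on the Petya variant, so treat them as assumptions and rely on the decryptor’s own documentation in practice.

```python
# Illustrative sketch: pull the 512-byte verification sector and the 8-byte
# nonce out of a raw image of a Petya-infected disk. The offsets below are
# placeholders (real locations depend on the Petya variant); this is not the
# official extraction tool.
import base64

DISK_IMAGE = "infected_disk.img"      # raw image taken from the removed drive

SECTOR_SIZE = 512
VERIFICATION_SECTOR = 55              # assumed sector number, variant-dependent
NONCE_OFFSET = 54 * SECTOR_SIZE + 33  # assumed byte offset, variant-dependent
NONCE_LENGTH = 8

with open(DISK_IMAGE, "rb") as disk:
    disk.seek(VERIFICATION_SECTOR * SECTOR_SIZE)
    verification_data = disk.read(SECTOR_SIZE)

    disk.seek(NONCE_OFFSET)
    nonce = disk.read(NONCE_LENGTH)

# The key-generation site expects the data in text form, e.g. Base64.
print("verification data:", base64.b64encode(verification_data).decode())
print("nonce:           ", base64.b64encode(nonce).decode())
```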

Once that is done, the scrambled hard drive has to be reinserted into a computer to bring up the Petya ransom demand screen, at which stage the decryption key can be entered.

Tech support site Bleeping Computer, run by computer forensics specialist Lawrence Abrams, reported success with Leo Stone’s Petya decryptor, with keys being generated in just seconds.

A Windows tool to make it easier to extract the verification data and nonce was also created by researcher Fabian Wosar from security vendor Emsisoft.

WhatsApp’s encryption services are legal for now, but maybe not for long

WhatsApp introduced end-to-end encryption for all its services today. This means that all user calls, texts, video, images and other files sent can only be viewed by the intended recipient, and no one, not even WhatsApp itself, can access this data. This guarantee of user privacy creates new concerns for the government.

WhatsApp will now find it impossible to comply with government requests for data, since WhatsApp itself will not have the decryption key. In effect, WhatsApp is doing exactly what Apple did in the Apple vs FBI battle; it’s preventing government access to data, but on a much larger scale. While Apple restricted access to users of iPhones only, now practically every user of WhatsApp on any device is protected. 51% of all users of internet messaging services in India use WhatsApp, with a total number of over 70 million users (Source: TRAI’s OTT Consultation Paper, dated March 2015). WhatsApp has now prevented government access to the messages and calls of at least 70 million Indian users.
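
For readers who want to see what “only the intended recipient can read it” means in practice, here is a minimal sketch of the idea using the PyNaCl library’s public-key box. WhatsApp actually uses the Signal protocol, which is far more sophisticated (key ratcheting, forward secrecy and so on); this example, with its made-up message and party names, only illustrates the basic property that a server relaying the ciphertext cannot decrypt it.

```python
# Minimal end-to-end encryption sketch using PyNaCl (libsodium bindings).
# This is NOT WhatsApp's Signal-protocol implementation; it only shows the
# core property: a relaying server sees ciphertext it cannot decrypt.
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at 6pm")

# The message relay (the "server") only ever handles `ciphertext`.
# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))   # b'meet at 6pm'
```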

No encryption requirements are applicable on OTTs like WhatsApp

Telecom service providers and internet service providers, like Airtel and Vodafone, have to obtain a license from the Department of Telecommunications in order to be able to provide such services in India. This license includes several restrictions, including license fees, ensuring emergency services, confidentiality of customer information and requirements for lawful interception, monitoring and the security of the network. These include encryption requirements.

For example, the ‘License Agreement for Provision of Internet Service (Including Internet Telephony)’ for internet service providers (like Reliance and Airtel), permits the usage of up to 40-bit encryption. To employ a higher encryption standard, permission will have to be acquired and a decryption key deposited with the Telecom Authority.
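
To put the 40-bit figure in perspective, a quick back-of-the-envelope calculation shows why it is considered weak today. The guessing rate used below is an assumed, illustrative figure for commodity hardware, not a measured benchmark.

```python
# Back-of-the-envelope comparison of 40-bit and 128-bit key spaces.
# The guess rate is an assumed, illustrative figure, not a benchmark.
GUESSES_PER_SECOND = 1_000_000_000   # assumption: 1 billion keys per second

for bits in (40, 128):
    keyspace = 2 ** bits
    seconds = keyspace / GUESSES_PER_SECOND
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{bits}-bit: {keyspace:.3e} keys, "
          f"~{seconds:,.0f} s to exhaust (~{years:.2e} years)")

# 40-bit: about 1.1e12 keys, exhaustible in roughly 18 minutes at this rate.
# 128-bit: about 3.4e38 keys, on the order of 1e22 years at the same rate.
```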

Apps like WhatsApp, Skype and Viber are, however, neither telecom service providers nor internet service providers. These are known as ‘Over-The-Top Services’, or OTTs. Currently, OTTs are not regulated and, as such, there are no encryption requirements, nor any other security requirements with which they must comply.

The Telecom Regulatory Authority of India came out with an OTT Consultation Paper in 2015. Discussions on the paper are closed, but TRAI is yet to issue regulations on the matter. In the absence of any regulations at present, it’s clear that WhatsApp’s new end-to-end encryption policy is perfectly legal, even though it presents a new dilemma for the government.

Impact of end-to-end encryption on proposed regulatory system

Other countries have adopted various approaches to resolve the issue of OTT services. For example, in France, Skype was made to register as a telecom operator. In Germany, Voice-Over-IP is subject to the same requirements as other telecom services because of the technology neutral approach of its Telecommunications Act. In China, VOIP calls have a separate regulatory system under the head of ‘voice based calls’. These systems will make voice-over-IP subject to the same security requirements as telecom providers. For the most part however, OTT services are unregulated abroad as well.

In a detailed discussion of the issue in its OTT Consultation Paper, TRAI notes that OTT services circumvent all regulatory requirements by providing services that are otherwise available only through a license. It has suggested classifying OTT services either as communication service providers or as application service providers, and imposing regulatory requirements similar to those placed on telecom service providers.

The proposed licensing requirements include enabling ‘lawful interception’. It can be assumed that the provisions will be along the lines of those imposed on telecom service providers. Given that 40-bit encryption is a much lower standard than that used by WhatsApp, and considering that WhatsApp doesn’t even possess a decryption key to deposit with the relevant authority, it remains to be seen how the government will gain access to WhatsApp messages.

Liability of WhatsApp to comply with decryption directions under IT Act

WhatsApp, being an intermediary, is expected to comply with directions to intercept, monitor and decrypt information issued under Section 69 of the Information Technology Act, 2000. Complying with such a direction will now be impossible for WhatsApp in view of its end-to-end encryption. Even before end-to-end encryption was introduced, WhatsApp, which is not based in India, may have been able to refuse to comply with such directions. In fact, compliance by such companies with data requests from the Indian government has been reported to be very low.

India’s now-withdrawn draft encryption policy took the first step towards overcoming these problems and obtaining access. It required service providers, from both India and abroad, that use encryption technology to enter into agreements with India in order to be able to provide such services. One essential requirement of these agreements was to comply with data requests as and when they are made by the government, including any interception, monitoring and decryption requests made under Section 69 of the IT Act. Though it was later clarified that WhatsApp is not within the purview of this policy, it indicates the route the government may take to obtain access. If WhatsApp refused to comply with such a regime, it would become illegal in India.

End-to-end encryption is not without its drawbacks. The high, unbreachable level of security and privacy it provides favours users over governments, and it will make such systems the favorite for illegal activities as well. For example, tracing voice calls made by terrorists using Voice-Over-IP is extremely difficult because of routing over fake networks. The issue raised in the Apple vs FBI case was the same: can an individual user’s privacy be compromised in favour of the larger public interest? A balance is needed between maintaining user privacy and allowing interception for lawful purposes.

Brooklyn case takes front seat in Apple encryption fight

The Justice Department said Friday it will continue trying to force Apple to reveal an iPhone’s data in a New York drug case, putting the Brooklyn case at the center of a fight over whether a 227-year-old law gives officials wide authority to force a technology company to help in criminal probes.

The government told U.S. District Judge Margo K. Brodie in Brooklyn that it still wants an order requiring Apple’s cooperation in the drug case even though it recently dropped its fight to compel Apple to help it break into an iPhone used by a gunman in a December attack in San Bernardino that killed 14 people.

“The government’s application is not moot and the government continues to require Apple’s assistance in accessing the data that it is authorized to search by warrant,” the Justice Department said in a one-paragraph letter to Brodie.

Apple expressed disappointment, saying its lawyers will press the question of whether the FBI has tried any other means to get into the phone in Brooklyn.

Apple had sought to delay the Brooklyn case, saying that the same technique the FBI was using to get information from the phone in California might work with the drug case phone, eliminating the need for additional litigation.

Federal prosecutors told Brodie on Friday that they would not modify their March request for her to overturn a February ruling concluding that the centuries-old All Writs Act could not be used to force Apple to help the government extract information from iPhones.

Magistrate Judge James Orenstein made the ruling after inviting Apple to challenge the 1789 law, saying he wanted to know if the government requests had created a burden for the Cupertino, California-based company.

Since then, lawyers say Apple has opposed requests to help extract information from over a dozen iPhones in California, Illinois, Massachusetts and New York.

In challenging Orenstein’s ruling, the government said the jurist had overstepped his powers, creating “an unprecedented limitation on” judicial authority.

It said it did not have adequate alternatives to obtaining Apple’s assistance in the Brooklyn case, which involves a phone running a different version of the operating system than the phone at issue in the California case.

In a statement Friday, Justice Department spokeswoman Emily Pierce said the mechanism used to gain access in the San Bernardino case can only be used on a narrow category of phones.

“In this case, we still need Apple’s help in accessing the data, which they have done with little effort in at least 70 other cases when presented with court orders for comparable phones running iOS 7 or earlier operating systems,” she said.

Apple is due to file a response in the case by Thursday.