FBI: Encryption increasing problem

The FBI is facing an increasing struggle to access readable information and evidence from digital devices because of default encryption, a senior FBI official told members of Congress at a hearing on digital encryption Tuesday.

Amy Hess said officials encountered passwords in 30 percent of the phones the FBI seized during investigations in the last six months, and investigators have had “no capability” to access information in about 13 percent of the cases.

“We have seen those numbers continue to increase, and clearly that presents us with a challenge,” said Hess, the executive assistant director of the FBI branch that oversees the development of surveillance technologies.

In her testimony to a subcommittee of the House Energy and Commerce Committee, Hess defended the Justice Department’s use of a still-unidentified third party to break into the locked iPhone used by one of the two San Bernardino, California, attackers. But she said the reliance on an outside entity represented just “one potential solution” and that there’s no “one-size-fits-all” approach for recovering evidence.

Representatives from local law enforcement agencies echoed Hess’s concerns. Thomas Galati, chief of the intelligence bureau at the New York Police Department, said officials there have been unable to break open 67 Apple devices for use in 44 different investigations of violent crime — including 10 homicide cases.

Still, despite anxieties over “going dark,” a February report from the Berkman Center for Internet and Society at Harvard University said the situation was not as dire as law enforcement had described and that investigators were not “headed to a future in which our ability to effectively surveil criminals and bad actors is impossible.”

The hearing comes amid an ongoing dispute between law enforcement and Silicon Valley about how to balance consumer privacy against the need for police and federal agents to recover communications and eavesdrop on suspected terrorists and criminals. The Senate is considering a bill that would effectively prohibit unbreakable encryption and require companies to help the government access data on a computer or mobile device when a warrant is issued. Bruce Sewell, Apple’s general counsel, touted the importance of encryption.

“The best way we, and the technology industry, know how to protect your information is through the use of strong encryption. Strong encryption is a good thing, a necessary thing. And the government agrees,” Sewell testified.

How Apple makes encryption easy and invisible

Do you know how many times a day you unlock your iPhone? Every time you do, you’re participating in Apple’s user-friendly encryption scheme.

Friday, the company hosted a security “deep dive” at which it shared some interesting numbers about its security measures and philosophy as well as user habits. To be honest, we’re less concerned with how Apple’s standards work than the fact that they do and will continue to. But that’s kind of the point behind the whole system — Apple designed its encryption system so that we don’t even have to think about it.

Apple’s encryption and security protocols have faced a ton of scrutiny during its recent showdown with the government. And if anything, that debate has gotten more people thinking seriously about how data can and should be secured. And the topic is not going away for a while.

We weren’t there Friday, but Ben Bajarin from Techpinions offers some great analysis, and his piece includes some really cool stats. For one, Apple says that the average user unlocks their phone 80 times a day. We don’t know if that’s across all platforms or just iOS. It sounds a little low in my case, however, because I’m generally pretty fidgety.

But because people are checking their phones so often, it’s important for Apple developers to make encryption powerful without causing the end user frustration. Like if they could just plunk their thumb down, and their phone would unlock, for example.

Apple says 89 percent of people who own Touch ID-enabled devices use the feature. That’s a really impressive adoption rate, but it makes sense when you think about how much easier the biometric system is to use than a passcode.

Passcodes are great, of course, and you have to have one. But as an experiment a while ago, I turned off Touch ID and went numbers-only to unlock my phone. And guess what? It was really annoying. I switched the feature back on by the end of the day.

Apple also talked up its so-called Secure Enclave, its slightly intimidating name for the dedicated co-processor that has handled all encryption for its devices since the iPhone 5s. Each Enclave has its own unique ID (UID) that it uses to scramble all of the other data for safekeeping. And neither Apple nor the other parts of your phone know what that UID is; it all just happens on its own. And that’s pretty much how we prefer it.
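The UID idea is easy to sketch. The Python snippet below is a toy illustration only — Apple’s actual scheme is unpublished and runs in dedicated hardware — but it shows the principle that a key derived from both a hardware UID and the user’s passcode is useless on any other device. All names here are invented.

```python
import hashlib

def derive_device_key(device_uid: bytes, passcode: str,
                      iterations: int = 100_000) -> bytes:
    """Toy sketch of entangling a hardware UID with the passcode.

    PBKDF2 stretches the passcode; the UID acts as a device-bound
    salt, so the same passcode yields different keys on different
    hardware.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               device_uid, iterations)

key_a = derive_device_key(b"\x01" * 16, "123456")
key_b = derive_device_key(b"\x02" * 16, "123456")
# Same passcode, different (simulated) hardware -> different keys,
# so data copied off the device cannot be decrypted elsewhere.
assert key_a != key_b
```

This is why brute-forcing has to happen on the phone itself: without the UID, the passcode alone is not enough to reconstruct the key.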

Apple, FBI set to resume encryption fight at House hearing

The encryption battle between Apple and the FBI is moving from the courtroom to Congress next week.

Representatives from the tech titan and the federal law enforcement agency are scheduled to testify Tuesday before the House Energy and Commerce Committee about the debate over how the use of encryption in tech products and services hampers law enforcement activities.

In February, Apple clashed with the FBI over whether the company would help investigators hack into the encrypted iPhone of San Bernardino shooter Syed Farook. That case ended when the FBI said it had found a way to unlock the phone without Apple’s help. The debate, however, is unresolved.

Technology companies and rights groups argue that strong encryption, which scrambles data so it can be read only by the right person, is needed to keep people safe and protect privacy. Law enforcement argues it can’t fight crimes unless it has access to information on mobile devices.
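What “scrambles data so it can be read only by the right person” means can be shown with a toy symmetric cipher. The sketch below (Python, standard library only, all names invented) uses SHA-256 in counter mode as a makeshift keystream; it illustrates the idea but is not a vetted cipher like AES.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 over key + nonce + counter. Illustration
    only; real systems use vetted ciphers such as AES or ChaCha20."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR with the keystream; the same call encrypts and decrypts."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = secrets.token_bytes(32)
nonce = secrets.token_bytes(8)
message = b"meet at noon"

ciphertext = xor_crypt(key, nonce, message)
assert ciphertext != message                       # scrambled in transit
assert xor_crypt(key, nonce, ciphertext) == message  # key holder recovers it
```

Anyone intercepting `ciphertext` without `key` sees only noise — which is exactly the property law enforcement says blocks its investigations.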

The hearing, called “Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives,” will include two panels. The first features Amy Hess, executive assistant director for science and technology at the FBI, who will speak about law enforcement concerns along with other law enforcement officials from around the country. Apple general counsel Bruce Sewell will speak during a second panel, which will feature computer science and security professionals.

The FBI and Apple did not immediately respond to requests for comment on their testimony.

The hearing comes just a day after a US Senate encryption bill was released that would give law enforcement and government investigators access to encrypted devices and communications. Authored by US Sens. Dianne Feinstein and Richard Burr, the bill furthers a fight that pits national security against cybersecurity.

Earlier this month, Facebook complicated things a bit further for the FBI when it announced that all communications sent on its popular WhatsApp messaging app are now encrypted.

Feinstein encryption bill sets off alarm bells

A draft version of a long-awaited encryption bill from Sens. Dianne Feinstein, D-Calif., and Richard Burr, R-N.C., was leaked online last week, and the technology industry is already calling foul.

The bill requires any company that receives a court order for information or data to “provide such information or data to such government in an intelligible format” or to “provide such technical assistance as is necessary to obtain such information or data in an intelligible format.” It doesn’t specify the terms under which a company would be forced to help, or what the parameters of “intelligible” are.

The lack of these boundaries is one of the reasons why the backlash to the bill — which isn’t even finished — has been so fast and overwhelming. Kevin Bankston, director of the Open Technology Institute, called it “easily the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen.”

It’s disheartening that the senators intend to continue pressing on with this bill, especially in light of the FBI’s recent bullying of Apple. After the FBI bungled its handling of the San Bernardino shooter’s phone, it tried and failed to force Apple into creating a new program that would let it hack into not just the shooter’s phone but probably many other phones as well. When Apple resisted, the FBI mysteriously came up with a workaround. Small wonder other technology companies are reacting poorly to this Senate bill.

Feinstein’s staffers said that the issue is larger than one phone. That’s true — and it’s exactly why such a broad proposal should make everyone who uses a smartphone uneasy. Giving law enforcement such a broad mandate would inevitably lead to questionable decisions, and it would weaken Internet security for everyone.

Feinstein’s staff also said that the reason for the bill’s vagueness is that the goal is simply to clarify law, not to set a strict method for companies or to tell the court what the penalties should be should companies choose not to follow orders. That sounds good in theory. In practice, Feinstein and Burr would be well-advised to go back to the table with technology interests — and really listen to their concerns.

“Petya” ransomware encryption cracked

"Petya" ransomware encryption cracked

Utility generates unscrambling key.

Users whose data has been held to ransom by the Petya malware now have an option to decrypt the information, thanks to a new tool that generates an unscrambling key.

Petya appeared around March this year. Once executed with Windows administrator privileges, Petya rewrites the master boot record on the computer’s hard drive, crashes the operating system and, on restart, scrambles the data on the disk while masquerading as the CHKDSK file consistency utility.

The Petya attackers then demand approximately A$555 in ransom, payable in Bitcoin, to provide a decryption key for the locked system.

An anonymous security researcher using the Twitter handle leo_and_stone has now cracked the encryption Petya uses: Salsa10, an early variant of Daniel J. Bernstein’s Salsa stream cipher family, dating from 2004.

Decrypting hard disks scrambled with Petya using the tool is a relatively complex operation. The tool requires data from an eight-byte nonce (a random, use-once number) file and a 512-byte sector from the hard disk to be fed into a website, which generates the decryption key.

This means the Petya-infected hard drive has to be removed from the victim computer so that the small amount of data the decryptor needs can be read and copied with low-level system utilities.

Once that is done, the scrambled hard drive has to be reinserted into a computer to bring up the Petya ransom demand screen, at which stage the decryption key can be entered.
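For readers curious about the mechanics, the manual extraction step can be sketched in a few lines of Python. The sector and offset values below are assumptions drawn from published analyses of Petya and should be verified against an actual sample; the function simply pulls the two pieces of data the web tool asks for out of a raw image of the infected disk.

```python
SECTOR = 512

# Assumed on-disk layout (verify against your own Petya sample):
NONCE_SECTOR, NONCE_OFFSET = 54, 33   # 8-byte nonce
VERIFY_SECTOR = 55                    # 512-byte verification sector

def extract_petya_material(disk_image_path: str):
    """Read the nonce and verification sector from a raw disk image.

    These are the two inputs the key-generation website requires;
    in practice the drive is imaged first with low-level utilities
    such as dd.
    """
    with open(disk_image_path, "rb") as disk:
        disk.seek(NONCE_SECTOR * SECTOR + NONCE_OFFSET)
        nonce = disk.read(8)
        disk.seek(VERIFY_SECTOR * SECTOR)
        verification = disk.read(SECTOR)
    return nonce, verification
```

With the key in hand, the drive goes back into a machine so the code can be typed into Petya’s own ransom screen, as described above.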

Tech support site Bleeping Computer, run by computer forensics specialist Lawrence Abrams, reported success with Leo Stone’s Petya decryptor, with keys being generated in just seconds.

A Windows tool to make it easier to extract the verification data and nonce was also created by researcher Fabian Wosar from security vendor Emsisoft.

WhatsApp’s encryption services are legal for now, but maybe not for long

WhatsApp introduced end-to-end encryption for all its services today. This means that all user calls, texts, video, images and other files sent can only be viewed by the intended recipient, and no one, not even WhatsApp itself, can access this data. This guarantee of user privacy creates new concerns for the government.

WhatsApp will now find it impossible to comply with government requests for data, since WhatsApp itself will not hold the decryption key. In effect, WhatsApp is doing exactly what Apple did in the Apple vs FBI battle; it’s preventing government access to data, but on a much larger scale. While Apple’s protections covered iPhone users only, practically every WhatsApp user on any device is now protected. 51% of all users of internet messaging services in India use WhatsApp, with a total of over 70 million users (Source: TRAI’s OTT Consultation Paper, dated March 2015). WhatsApp has now prevented government access to the messages and calls of at least 70 million Indian users.
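WhatsApp’s actual implementation is the Signal protocol (built on Curve25519), but the underlying principle — two parties agreeing on a secret that the relaying server never learns — can be illustrated with classic Diffie-Hellman key exchange. This toy Python sketch uses a deliberately small prime, so it is illustrative only, not secure:

```python
import secrets

P = 2 ** 61 - 1   # a Mersenne prime; toy-sized, NOT safe for real use
G = 2             # public generator

def keypair():
    """Return (private, public): private stays on the device,
    public is what crosses the server."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

# Alice and Bob each generate a keypair; only public values transit
# the server, which therefore never sees the shared secret.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

shared_a = pow(b_pub, a_priv, P)   # Alice combines Bob's public value
shared_b = pow(a_pub, b_priv, P)   # Bob combines Alice's public value
assert shared_a == shared_b        # identical secret, never transmitted
```

Because the server only ever relays `a_pub` and `b_pub`, a court order served on the provider yields nothing that decrypts the traffic — which is precisely the government’s new dilemma.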

No encryption requirements are applicable on OTTs like WhatsApp

Telecom service providers and internet service providers, like Airtel and Vodafone, have to obtain a license from the Department of Telecommunications in order to be able to provide such services in India. This license includes several restrictions, including license fees, ensuring emergency services, confidentiality of customer information and requirements for lawful interception, monitoring and the security of the network. These include encryption requirements.

For example, the ‘License Agreement for Provision of Internet Service (Including Internet Telephony)’ for internet service providers (like Reliance and Airtel) permits the usage of up to 40-bit encryption. To employ a stronger encryption standard, permission must be obtained and a decryption key deposited with the Telecom Authority.
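To see why the 40-bit ceiling matters, compare the brute-force effort against a modern key size. Assuming (hypothetically) a billion key trials per second:

```python
keys_40_bit = 2 ** 40          # keyspace allowed without special permission
keys_128_bit = 2 ** 128        # typical modern symmetric keyspace

rate = 10 ** 9                 # assumed: one billion trials per second

seconds_40 = keys_40_bit / rate                        # roughly 18 minutes
years_128 = keys_128_bit / rate / (365 * 24 * 3600)    # ~1e22 years

print(f"40-bit keyspace exhausted in about {seconds_40 / 60:.0f} minutes")
print(f"128-bit keyspace would take about {years_128:.2e} years")
```

A 40-bit key falls to a single machine in minutes, while a 128-bit key is out of reach of any conceivable hardware — which is why the licensed ceiling and modern messaging encryption are in entirely different leagues.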

Apps like WhatsApp, Skype and Viber are, however, neither telecom service providers nor internet service providers. These are known as ‘Over-The-Top Services’, or OTTs. Currently, OTTs are not regulated, and as such there are no encryption requirements, nor any other security requirements with which they have to comply.

The Telecom Regulatory Authority of India came out with an OTT Consultation Paper in 2015. Discussions on the paper are closed, but TRAI is yet to issue regulations on the matter. In the absence of any regulations at present, it’s clear that WhatsApp’s new end-to-end encryption policy is perfectly legal, even though it presents a new dilemma for the government.

Impact of end-to-end encryption on proposed regulatory system

Other countries have adopted various approaches to resolve the issue of OTT services. For example, in France, Skype was made to register as a telecom operator. In Germany, Voice-Over-IP is subject to the same requirements as other telecom services because of the technology-neutral approach of its Telecommunications Act. In China, VOIP calls have a separate regulatory system under the head of ‘voice based calls’. These systems make Voice-Over-IP subject to the same security requirements as telecom providers. For the most part, however, OTT services are unregulated abroad as well.

In a detailed discussion of the issue in its OTT Consultation Paper, TRAI notes that OTT services circumvent all regulatory requirements by providing services which are otherwise available only through a license. It has suggested classifying OTT services either as communication service providers or as application service providers, and imposing regulatory requirements similar to those on telecom service providers.

The proposed licensing requirements include enabling ‘lawful interception’. It can be assumed that the provisions will be along the lines of those imposed on telecom service providers. Given that 40-bit encryption is a much lower standard than the one WhatsApp uses, and considering that WhatsApp doesn’t even possess a decryption key to deposit with the relevant authority, it remains to be seen how the government will gain access to WhatsApp messages.

Liability of WhatsApp to comply with decryption directions under IT Act

WhatsApp, being an intermediary, is expected to comply with directions to intercept, monitor and decrypt information issued under Section 69 of the Information Technology Act, 2000. Complying with such a direction will now be impossible for WhatsApp in view of its end-to-end encryption. Even before this change, WhatsApp, as a company not based in India, may have been able to refuse to comply with such directions. In fact, compliance by such companies with data requests from the Indian government has been reported to be very low.

India’s now-withdrawn draft encryption policy took the first step towards overcoming these problems and obtaining access. It required service providers, from both India and abroad, that use encryption technology to enter into agreements with India in order to provide such services. One essential requirement of these agreements was to comply with data requests as and when they are made by the government, including any interception, monitoring and decryption requests made under Section 69 of the IT Act. Though it was later clarified that WhatsApp is not within the purview of this policy, this indicates the route the government may take to obtain access. If WhatsApp refused to comply with such a regime, it would become illegal in India.

End-to-end encryption is not without its drawbacks. The high, unbreachable level of security and privacy it offers favours users over governments, and it will make such systems a favourite for illegal activities as well. For example, tracing voice calls made by terrorists using Voice-Over-IP is extremely difficult because of routing over fake networks. The issue raised in the Apple vs FBI case was the same: whether an individual user’s privacy can be compromised in favour of the larger public interest. A balance between the two is needed, one that maintains user privacy while allowing interception for lawful purposes.

Brooklyn case takes front seat in Apple encryption fight

The Justice Department said Friday it will continue trying to force Apple to reveal an iPhone’s data in a New York drug case, putting the Brooklyn case at the center of a fight over whether a 227-year-old law gives officials wide authority to force a technology company to help in criminal probes.

The government told U.S. District Judge Margo K. Brodie in Brooklyn that it still wants an order requiring Apple’s cooperation in the drug case even though it recently dropped its fight to compel Apple to help it break into an iPhone used by a gunman in a December attack in San Bernardino that killed 14 people.

“The government’s application is not moot and the government continues to require Apple’s assistance in accessing the data that it is authorized to search by warrant,” the Justice Department said in a one-paragraph letter to Brodie.

Apple expressed disappointment, saying its lawyers will press the question of whether the FBI has tried any other means to get into the phone in Brooklyn.

Apple had sought to delay the Brooklyn case, saying that the same technique the FBI was using to get information from the phone in California might work with the drug case phone, eliminating the need for additional litigation.

Federal prosecutors told Brodie on Friday that they would not modify their March request for her to overturn a February ruling concluding that the centuries-old All Writs Act could not be used to force Apple to help the government extract information from iPhones.

Magistrate Judge James Orenstein made the ruling after inviting Apple to challenge the 1789 law, saying he wanted to know if the government requests had created a burden for the Cupertino, California-based company.

Since then, lawyers say Apple has opposed requests to help extract information from over a dozen iPhones in California, Illinois, Massachusetts and New York.

In challenging Orenstein’s ruling, the government said the jurist had overstepped his powers, creating “an unprecedented limitation on” judicial authority.

It said it did not have adequate alternatives to Apple’s assistance in the Brooklyn case, which involves a phone running a different version of the operating system than the phone at issue in the California case.

In a statement Friday, Justice Department spokeswoman Emily Pierce said the mechanism used to gain access in the San Bernardino case can only be used on a narrow category of phones.

“In this case, we still need Apple’s help in accessing the data, which they have done with little effort in at least 70 other cases when presented with court orders for comparable phones running iOS 7 or earlier operating systems,” she said.

Apple is due to file a response in the case by Thursday.

How to encrypt iPhone and Android, and why you should do it now

Apple’s fight with the FBI may be over for the time being, but this high-profile fight about user privacy and state security may have puzzled some smartphone users. When is an iPhone or Android device encrypted? And how does one go about securing the data on them?

iPhone

It’s pretty simple, actually: as long as you set up a password or PIN for the iPhone or iPad’s lockscreen, the device is encrypted. Without knowing the access code, nobody can unlock it, which means your personal data, including photos, messages, mail, calendar, contacts and data from other apps, is secured. Sure, the FBI can crack some iPhones, but only ones involved in criminal investigations, and only if its recent hacks work on those particular models.

If you don’t use a lockscreen password, set one up right away. Go to Settings, then Touch ID & Passcode, tap Turn Passcode On and enter a strong passcode or password.

Android

As CNET points out, things are a bit more complicated on Android.

The newer the device, the easier it is to get it done. In this category, we have Nexus devices, the Galaxy S7 series, and other new handsets that ship with Android 6.0 preloaded. Just like with the iPhone, go to the Settings app to enable a security lock for the screen, and the phone is encrypted.

With older devices, the encryption procedure is a bit more complex, as you’ll also have to encrypt the handset manually. You’ll even have to do it with some newer devices, including the Galaxy S6 and Moto X Pure. Go to Settings, then Security, then Encrypt phone. While you’re at it, you may want to encrypt your microSD card as well, so data on it can’t be read on other devices – do it from the Security menu, then Encrypt external SD card. Once that’s done, you will still need to use a password for the lockscreen.

CNET says there are reasons you should consider not encrypting your Android device, like the fact that a device might take a performance hit when encrypted. The performance drop may be barely noticeable on new devices, but older models and low-end handsets could suffer.

Forget iPhone encryption, the FBI can’t legally touch the software ISIS uses

The FBI insists that encrypted products like the iPhone and encrypted online services will put people in harm’s way, especially in light of the ISIS-connected San Bernardino shooting late last year. That’s why the Bureau has been arguing for encryption backdoors that would be available to law enforcement agencies, and why it looked to coerce Apple to add a backdoor to iOS.

However, extensive reports on the preparations ISIS made before hitting Paris and Brussels revealed the kind of encrypted products ISIS radicals used to stay in touch with central command. Unsurprisingly, these products are out of the FBI’s jurisdiction, and one in particular is among the safest encrypted communication products you can find online. In fact, its original developers are suspected to have ties to the criminal underworld.

Telling the inside story of the Paris and Brussels attacks, CNN explains that ISIS cell members used a chat program called Telegram to talk to one another in the moments ahead of the attacks. Using data obtained from official investigations, CNN learned that just hours before the Bataclan theater was hit, one of the attackers had downloaded Telegram on a Samsung smartphone.

Police never recovered communications from the messaging app. Not only is Telegram encrypted end-to-end, but it also has a self-destruct setting.

Conceived by Russian developers, the app is out of the FBI’s jurisdiction. But Telegram is the least problematic encrypted service for intelligence agencies looking to collect data and connect suspects. CNN also mentions a far more powerful app, one that hasn’t yet been cracked by law enforcement.

TrueCrypt is the app in question. One of the ISIS radicals who was captured by French police in the months leading to the mid-November Paris attacks revealed details about this program.

TrueCrypt resides on a thumb drive and is used to encrypt messages. French citizen and IT expert Reda Hame was instructed to upload the encrypted message to a Turkish file-sharing site. “An English-speaking expert on clandestine communications I met over there had the same password,” Hame told interrogators. “It operated like a dead letter drop.”

According to The New York Times, Hame was told not to send the message via email, so as to not generate any metadata that would help intelligence agencies connect him to other terrorists.

The ISIS technician also instructed Hame to transfer TrueCrypt from the USB key to a second unit once he reached Europe. “He told me to copy what was on the key and then throw it away,” Hame explained. “That’s what I did when I reached Prague.”

Hame made a long journey home from Turkey, making it look like he was a tourist visiting various cities in Europe. Whenever he reached a new place, he was to call a special number belonging to one of the masterminds behind the attacks, and he used a local SIM card to mark his location.

The Times also mentions a secondary program that was installed on flash drives. Called CCleaner, the program can be used to erase a user’s online history on any computer.

If that’s not enough to show the level of sophistication of these bloody ISIS attacks on Europe and other targets, a story from The New Yorker sheds more light on TrueCrypt, a program whose creators can’t be forced to assist the FBI.

According to the publication, TrueCrypt was launched in 2004 to replace a program called Encryption for the Masses (E4M), developed long before the iPhone existed. Interestingly, the programmer who made E4M is Paul Le Roux, who also happens to be a dangerous crime lord, having built a global drug, arms and money-laundering cartel out of a base in the Philippines.

E4M is open-source, and so is TrueCrypt, meaning that their creators aren’t companies motivated by a financial interest to keep their security intact.

“TrueCrypt was written by anonymous folks; it could have been Paul Le Roux writing under an assumed name, or it could have been someone completely different,” Johns Hopkins Information Security Institute computer-science professor Matthew Green told The New Yorker.

The developers stopped updating it in 2014 for fear that Le Roux’s decision to cooperate with the DEA might cripple its security. Le Roux was arrested in Liberia on drug-trafficking charges in September 2012. But Green concluded in 2015 that TrueCrypt is still backdoor-free, which explains why ISIS agents still use it.