WhatsApp’s encryption services are legal for now, but maybe not for long

WhatsApp introduced end-to-end encryption for all its services today. This means that all user calls, texts, videos, images and other files can be viewed only by the intended recipient; no one, not even WhatsApp itself, can access this data. This guarantee of user privacy creates new concerns for the government.
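WhatsApp's rollout is built on the Signal protocol, but the core guarantee described above can be illustrated with plain public-key cryptography. The sketch below uses the PyNaCl library and is purely conceptual (the real protocol adds key ratcheting, authentication and forward secrecy on top of this pattern); its point is that a relaying server only ever handles ciphertext it cannot read.

    # Minimal sketch of the end-to-end idea using PyNaCl (libsodium bindings).
    # Illustrative only -- not WhatsApp's actual implementation.
    from nacl.public import PrivateKey, Box

    sender_key = PrivateKey.generate()         # stays on the sender's device
    recipient_key = PrivateKey.generate()      # stays on the recipient's device

    # Encrypt with the sender's private key and the recipient's public key.
    ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"meet at 6pm")

    # The server relaying 'ciphertext' cannot recover the message; only the
    # holder of recipient_key can.
    assert Box(recipient_key, sender_key.public_key).decrypt(ciphertext) == b"meet at 6pm"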

WhatsApp will now find it impossible to comply with government requests for data, since it will not hold the decryption keys. In effect, WhatsApp is doing exactly what Apple did in the Apple vs FBI battle: preventing government access to data, but on a much larger scale. While Apple restricted access only on iPhones, practically every WhatsApp user on any device is now protected. 51% of all users of internet messaging services in India use WhatsApp, a total of over 70 million users (Source: TRAI’s OTT Consultation Paper, dated March 2015). WhatsApp has thus placed the messages and calls of at least 70 million Indian users beyond the government’s reach.

No encryption requirements apply to OTTs like WhatsApp

Telecom service providers and internet service providers, like Airtel and Vodafone, must obtain a license from the Department of Telecommunications in order to provide such services in India. This license imposes several obligations, including license fees, provision of emergency services, confidentiality of customer information, and requirements for lawful interception, monitoring and network security. These include encryption requirements.

For example, the ‘License Agreement for Provision of Internet Service (Including Internet Telephony)’ for internet service providers (like Reliance and Airtel) permits the use of encryption only up to 40 bits. To employ a higher encryption standard, permission must be obtained and a decryption key deposited with the Telecom Authority.
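To put the 40-bit ceiling in perspective, the quick calculation below compares key-space sizes. The attacker speed is an assumption chosen only to make the contrast concrete: a 40-bit key space can be exhausted in well under a day, while the 256-bit keys used by modern end-to-end encryption are far beyond brute force.

    # Back-of-the-envelope key-space comparison (assumed guess rate, illustration only).
    keys_40_bit = 2 ** 40          # ~1.1 trillion possible keys
    keys_256_bit = 2 ** 256        # ~1.2e77 possible keys
    guesses_per_second = 10 ** 9   # assumption: one billion keys tried per second

    seconds_40 = keys_40_bit / guesses_per_second
    years_256 = keys_256_bit / guesses_per_second / (365 * 24 * 3600)
    print(f"40-bit exhaustive search:  {seconds_40 / 3600:.1f} hours at most")
    print(f"256-bit exhaustive search: {years_256:.1e} years at most")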

Apps like WhatsApp, Skype and Viber, however, are neither telecom service providers nor internet service providers. They are known as ‘Over-The-Top Services’, or OTTs. OTTs are currently unregulated, so there are no encryption requirements, nor any other security requirements, with which they must comply.

The Telecom Regulatory Authority of India came out with an OTT Consultation Paper in 2015. Discussions on the paper are closed, but TRAI is yet to issue regulations on the matter. In the absence of any regulations at present, it’s clear that WhatsApp’s new end-to-end encryption policy is perfectly legal, even though it presents a new dilemma for the government.

Impact of end-to-end encryption on proposed regulatory system

Other countries have adopted various approaches to the issue of OTT services. In France, for example, Skype was made to register as a telecom operator. In Germany, Voice-over-IP is subject to the same requirements as other telecom services because of the technology-neutral approach of its Telecommunications Act. In China, VoIP calls fall under a separate regulatory category of ‘voice-based calls’. These systems make Voice-over-IP subject to the same security requirements as telecom providers. For the most part, however, OTT services are unregulated abroad as well.

In a detailed discussion of the issue in its OTT Consultation Paper, TRAI notes that OTT services circumvent all regulatory requirements by providing services which are otherwise available only under a license. It has suggested classifying OTT services as either communication service providers or application service providers, and imposing regulatory requirements similar to those placed on telecom service providers.

The proposed licensing requirements include enabling ‘lawful interception’. It can be assumed that the provisions will be along the lines of those imposed on telecom service providers. Given that 40-bit encryption is a much lower standard than that used by WhatsApp, and that WhatsApp does not even possess a decryption key to deposit with the relevant authority, it remains to be seen how the government will gain access to WhatsApp messages.

Liability of WhatsApp to comply with decryption directions under IT Act

WhatsApp, being an intermediary, is expected to comply with directions to intercept, monitor and decrypt information issued under Section 69 of the Information Technology Act, 2000. Complying with such a direction is now impossible for WhatsApp in view of its end-to-end encryption. Even before its introduction, WhatsApp, as a company not based in India, may have been able to refuse to comply with such directions; compliance by such companies with data requests from the Indian government has in fact been reported to be very low.

India’s now-withdrawn draft encryption policy took a first step towards overcoming these problems and obtaining access. It required service providers, both Indian and foreign, that use encryption technology to enter into agreements with the Indian government in order to provide such services. One essential requirement of these agreements was compliance with data requests as and when the government makes them, including interception, monitoring and decryption requests under Section 69 of the IT Act. Though it was later clarified that WhatsApp was not within the purview of this policy, it indicates the route the government may take to obtain access. If WhatsApp refused to comply with such a regime, its services would become illegal in India.

End-to-end encryption is not without its drawbacks. The high, unbreachable level of security and privacy it provides favours users over governments, and it will make such systems attractive for illegal activities as well. For example, tracing voice calls made by terrorists over Voice-over-IP is extremely difficult because of its routing over fake networks. The issue raised in the Apple vs FBI case was the same: whether an individual user’s privacy can be compromised in favour of the larger public interest. A balance is needed between maintaining user privacy and allowing interception for lawful purposes.

Brooklyn case takes front seat in Apple encryption fight

The Justice Department said Friday it will continue trying to force Apple to reveal an iPhone’s data in a New York drug case, putting the Brooklyn case at the center of a fight over whether a 227-year-old law gives officials wide authority to force a technology company to help in criminal probes.

The government told U.S. District Judge Margo K. Brodie in Brooklyn that it still wants an order requiring Apple’s cooperation in the drug case even though it recently dropped its fight to compel Apple to help it break into an iPhone used by a gunman in a December attack in San Bernardino that killed 14 people.

“The government’s application is not moot and the government continues to require Apple’s assistance in accessing the data that it is authorized to search by warrant,” the Justice Department said in a one-paragraph letter to Brodie.

Apple expressed disappointment, saying its lawyers will press the question of whether the FBI has tried any other means to get into the phone in Brooklyn.

Apple had sought to delay the Brooklyn case, saying that the same technique the FBI was using to get information from the phone in California might work with the drug case phone, eliminating the need for additional litigation.

Federal prosecutors told Brodie on Friday that they would not modify their March request for her to overturn a February ruling concluding that the centuries-old All Writs Act could not be used to force Apple to help the government extract information from iPhones.

Magistrate Judge James Orenstein made the ruling after inviting Apple to challenge the 1789 law, saying he wanted to know if the government requests had created a burden for the Cupertino, California-based company.

Since then, lawyers say Apple has opposed requests to help extract information from over a dozen iPhones in California, Illinois, Massachusetts and New York.

In challenging Orenstein’s ruling, the government said the jurist had overstepped his powers, creating “an unprecedented limitation on” judicial authority.

It said it did not have adequate alternatives to obtaining Apple’s assistance in the Brooklyn case, which involves a phone running a different version of the operating system than the phone at issue in the California case.

In a statement Friday, Justice Department spokeswoman Emily Pierce said the mechanism used to gain access in the San Bernardino case can only be used on a narrow category of phones.

“In this case, we still need Apple’s help in accessing the data, which they have done with little effort in at least 70 other cases when presented with court orders for comparable phones running iOS 7 or earlier operating systems,” she said.

Apple is due to file a response in the case by Thursday.

How the FBI Cracked the iPhone Encryption and Averted a Legal Showdown With Apple

An urgent meeting inside FBI headquarters little more than a week ago is what convinced federal law enforcement officials that they may be able to abandon a brewing legal fight with tech giant Apple, sources told ABC News today.

In the days after the December 2015 massacre in San Bernardino, California, which killed 14 people and wounded 22 others, the iPhone left behind by one of the shooters, Syed Farook, was secretly flown to the FBI’s laboratory in Quantico, Virginia, sources said.

The FBI had been unable to review the phone’s contents due to a security feature that — after 10 failed attempts to enter the 4-digit access code — would render the phone’s files forever inaccessible.
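The obstacle here is not the passcode’s complexity but the retry limit enforced around it: a 4-digit code has only 10,000 possibilities, trivially guessable by machine, so the cap of 10 attempts is what makes brute force impractical. A simplified, purely illustrative model of that device-side logic (iOS actually enforces it in hardware-backed code with escalating delays) might look like this:

    # Simplified model of an attempt-limited passcode check. Illustrative only;
    # iOS enforces this with hardware-entangled keys and escalating delays.
    import hashlib, os

    MAX_ATTEMPTS = 10

    class PasscodeLock:
        def __init__(self, passcode: str):
            self.salt = os.urandom(16)
            self.digest = hashlib.pbkdf2_hmac("sha256", passcode.encode(), self.salt, 100_000)
            self.failures = 0

        def try_unlock(self, guess: str) -> bool:
            if self.failures >= MAX_ATTEMPTS:
                raise RuntimeError("data rendered permanently inaccessible")
            if hashlib.pbkdf2_hmac("sha256", guess.encode(), self.salt, 100_000) == self.digest:
                self.failures = 0
                return True
            self.failures += 1
            return False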

By last month, the FBI was at an impasse with Apple, which fought a court order telling the company to help authorities bypass the security feature. Apple maintained the U.S. government was asking it to create a “backdoor” into its devices that would endanger the privacy of hundreds of millions of iPhone users around the world.

“It is in our view the software equivalent of cancer,” Apple CEO Tim Cook recently told “World News Tonight” anchor David Muir.

But the FBI insisted it had a responsibility to access any data potentially relevant to the deadly terror attack in San Bernardino.

“I don’t know whether there is evidence of the identity of another terrorist on the phone, or nothing at all. But we ought to be fired in the FBI if we didn’t pursue that lead,” FBI Director James Comey told a House panel in February.

As the legal battle played out, the FBI appealed to cyber experts around the world for help.

“We’ve talked to anybody who will talk with us about it, and I welcome additional suggestions,” Comey said during a House hearing four weeks ago.

In response, countless companies and hackers — including what one source familiar with the matter called many “whackadoodles” — came forward claiming to have a way into Farook’s phone, sources said.

But nothing appeared viable. That is, until a company that the FBI has yet to identify came forward about two weeks ago. After initial contacts with the FBI, company officials flew to Washington to lay out their solution, sources told ABC News.

On Sunday, March 20, in a meeting at FBI headquarters, company officials demonstrated their technology on another iPhone. Convinced it would work, the FBI greenlighted applying it to Farook’s phone, sources said.

This past weekend — just days ago — the attempt was made, and “the FBI has now successfully retrieved the data stored on” the phone, according to the Justice Department.

Forensic examiners are now attempting to exploit potential evidence from the phone. It’s unclear if anything of investigative value has been found yet.

The FBI has refused to identify the company that offered the solution, with one source citing a “mutual agreement.” Nevertheless, Apple did not play a part in finding the solution, company officials said.

As for whether the solution might be shared with Apple, it’s a decision that will be made through consultation with multiple federal agencies, sources said.

One federal law enforcement source said it’s important to emphasize that the ultimate solution identified in this case was not found despite the lawsuit filed against Apple, but because of it.

The solution was “generated as a result of the media attention,” the source said.

At the same time, the source said federal authorities believe the end to the current litigation should not end the national discussion about balancing the interests of security and privacy.

“Our need for public safety and our need for privacy are crashing into each other, and we have to sort that out as a people,” Comey said recently. “This world some people imagine where nobody can look at your stuff is a world that will have public safety costs.”

FBI Hacks iPhone, Ending Apple Encryption Challenge

The Department of Justice said in a federal court filing Monday that it had bypassed encryption on the iPhone 5c used by a terrorist in a mass shooting last year in California and requested the court vacate its order compelling Apple to assist it in accessing the device.

The filing effectively ends a contentious legal battle between the federal government and Apple over the phone used by Syed Rizwan Farook. Farook was fatally shot by authorities along with his wife, Tashfeen Malik, after they killed 14 people in San Bernardino, California, in December.

“The government has now successfully accessed the data stored on Farook’s iPhone and therefore no longer requires the assistance from Apple Inc. mandated by Court’s Order Compelling Apple Inc. to Assist Agents in Search dated February 16, 2016,” government lawyers said in their filing in U.S. District Court for the Central District of California.

The two-page filing contains no information about the methods the government used to bypass the phone’s encryption.

A scheduled March 22 hearing was canceled last week after government lawyers said an “outside party” had proposed a possible way to unlock the phone that would not require Apple’s help. The tech giant had vowed to oppose the order in court, stating that helping the government access an encrypted iPhone would set a precedent for undermining privacy and cybersecurity.

“Our decision to conclude the litigation was based solely on the fact that, with the recent assistance of a third party, we are now able to unlock that iPhone without compromising any information on the phone,” prosecutors said in a statement.

“We sought an order compelling Apple to help unlock the phone to fulfill a solemn commitment to the victims of the San Bernardino shooting – that we will not rest until we have fully pursued every investigative lead related to the vicious attack,” the statement said. “Although this step in the investigation is now complete, we will continue to explore every lead, and seek any appropriate legal process, to ensure our investigation collects all of the evidence related to this terrorist attack. The San Bernardino victims deserve nothing less.”

Why few hackers are lining up to help FBI crack iPhone encryption

When the FBI said it couldn’t unlock the iPhone at the center of the San Bernardino shooting investigation without the help of Apple, the hackers at DriveSavers Data Recovery took it as a challenge.

Almost 200 man hours and one destroyed iPhone later, the Bay Area company has yet to prove the FBI wrong. But an Israeli digital forensics firm reportedly has, and the FBI is testing the method.

Finding a solution to such a high-profile problem would be a major feat — with publicity, job offers and a big payday on the line. But, in fact, the specialists at DriveSavers are among only a few U.S. hackers trying to solve it. Wary of the stigma of working with the FBI, many established hackers, who can be paid handsomely by tech firms for identifying flaws, say assisting the investigation would violate their industry’s core principles.

Some American security experts say they would never help the FBI; others waver in their willingness to do so. And not all of those who would consider helping want their involvement publicized, for fear of being labeled the hacker who unhinged a backdoor to millions of iPhones.

“The FBI has done such a horrible job of managing this process that anybody in the hacking community, the security community or the general public who would openly work with them would be viewed as helping the bad guys,” said Adriel Desautels, chief executive of cybersecurity testing company Netragard. “It would very likely be a serious PR nightmare.”

Much of the security industry’s frustration with the FBI stems from the agency’s insistence that Apple compromise its own security. The fact that the FBI is now leaning on outside help bolsters the security industry’s belief that, given enough time and funding, investigators could find a workaround — suggesting the agency’s legal tactics had more to do with setting a precedent than cracking the iPhone 5c owned by gunman Syed Rizwan Farook.

Some, like Mike Cobb, the director of engineering at DriveSavers in Novato, Calif., wanted to be the first to find a way in. Doing so could bring rewards, including new contracts and, if desired, free marketing.

“The bragging rights, the technical prowess, are going to be considerable and enhanced by the fact that it’s a very powerful case in the press,” said Shane McGee, chief privacy officer for cybersecurity software maker FireEye Inc.

Altruism could motivate others. Helping the FBI could further an inquiry into how a husband-and-wife couple managed to gun down 14 people, wound many others and briefly get away.

Another positive, McGee said, is that legal liability is low: While unauthorized tampering with gadgets has led to prison time, it’s legal as long as people meddle with iPhones they own — and the court order helps too.

But top security experts doubt the benefits are worth the risk of being seen as a black sheep within their community.

Hackers have said they don’t want to touch the San Bernardino case “with a 10-foot pole because the FBI doesn’t look like the good guy and frankly isn’t in the right asking Apple to put a back door into their program,” Desautels said. The assisting party, if ever identified, could face backlash from privacy advocates and civil liberties activists.

“They’d be tainted,” Desautels said.

The unease in the hacker community can be seen through Nicholas Allegra, a well-known iPhone hacker who most recently worked for Citrix.

Concerned an FBI victory in its legal fight with Apple would embolden authorities to force more companies to develop software at the government’s behest, Allegra had dabbled in finding a crack in iPhone 5c security. If successful, he hoped his findings would lead the FBI to drop the Apple dispute.

But he has left the project on the back burner, concerned that if he found a solution, law enforcement would use it beyond the San Bernardino case.

“I put in some work. I could have put more in,” he said. But “I wasn’t sure if I even wanted to.”

Companies including Microsoft, United Airlines and Uber encourage researchers and even hackers to target them and report problems by dangling cash rewards.

HackerOne, an intermediary for many of the companies, has collectively paid $6 million to more than 2,300 people since 2013. Boutique firms and freelancers can earn a living between such bounties and occasionally selling newly discovered hacking tools to governments or malicious hackers.

But Apple doesn’t have a bounty program, removing another incentive for tinkering with the iPhone 5c.

Still, Israeli firm Cellebrite is said to have attempted and succeeded at defeating the device’s security measures.

The company, whose technology is heavily used by law enforcement agencies worldwide to extract and analyze data from phones, declined to comment. The FBI has said only that an “outside party” presented a new idea Sunday night that will take about two weeks to verify. Apple officials said they aren’t aware of the details.

Going to the FBI before going to the company would violate standard practice in the hacking community. Security researchers almost always warn manufacturers about problems in their products and services before sharing details with anyone else, giving them time to issue a fix before a malicious party can exploit the flaw.

“We’ve never disclosed something to the government ahead of the company that distributed the hardware or software,” McGee said. “There could be far-reaching consequences.”

Another drawback is that an iPhone 5c vulnerability isn’t considered a hot commodity in the minds of many hackers, who seek to one-up each other by attacking newer, more widely used products. The 5c model went on sale in 2013 and lacks a fingerprint sensor. Newer iPhones are more powerful and have different security built into them. Only if the hack could be applied to contemporary iPhones would it be worth a rare $1-million bounty, experts say.

The limited scope of this case is why many hackers were taken aback by a court order asking for what they consider broadly applicable software to switch off several security measures. Instead, experts wanted the FBI to invest in going after the gunman’s specific phone with more creativity. In other words, attack the problem with technology, not the courts.

“If you have access to the hardware and you have the ability to dismantle the phone, the methodology doesn’t seem like it would be all that complex,” Desautels said.

Two years ago, his team tried to extract data from an iPad at the request of a financial services company that wanted to test the security of the tablets before offering them to employees. Netragard’s researcher failed after almost a month; he accidentally triggered a date change within the software that rendered the iPad unusable. But Desautels said cracking the iPad would have been “possible and trivial” for someone with more time and a dozen iPads to mess with.

The same, he imagines, would be true for an iPhone. The FBI, though, has said it had exhausted all known possibilities.

Taking Apple to court generated attention about the problem and “stimulated creative people around the world to see what they might be able to do,” FBI Director James Comey said in a letter to the Wall Street Journal editorial board Wednesday. Not “all technical creativity” resides within government, he said.

The plea worked, grabbing the interest of companies like DriveSavers, which gets about 2,000 gigs a month to retrieve photos, videos and notes from phones that are damaged or belong to someone who died. But despite all of the enticements in the San Bernardino case, they’ve worked to unlock an iPhone 5c only intermittently.

They’ve made progress. Cobb’s team can spot the encrypted data on an iPhone 5c memory chip. They’re exploring how to either alter that data or copy it to another chip. Both scenarios would allow them to reset the software that tracks invalid password entries. Otherwise, 10 successive misfires would render the encrypted data permanently inaccessible.

Swapping chips requires soldering, which the iPhone isn’t built to undergo multiple times. The team has an adapter that solves the issue, and about 300 old iPhones in its stockpile in case a device gets ruined, as one already has.
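In effect the team is describing a snapshot-and-restore attack on the retry counter (often called NAND mirroring): image the chip, spend a few guesses, then write the original image back so the failures are forgotten. The toy model below is purely conceptual, with an invented counter object rather than real firmware, but it shows why restoring state turns a 10-attempt limit into an unbounded search of the 10,000 possible codes.

    # Conceptual model of a snapshot-and-restore ("NAND mirroring") attack.
    # The RetryCounter class is invented for illustration; the real technique
    # involves physically copying and re-flashing the memory chip.
    import copy

    class RetryCounter:
        def __init__(self, secret: str):
            self.secret, self.failures = secret, 0

        def try_code(self, guess: str) -> bool:
            if self.failures >= 10:
                raise RuntimeError("wiped")
            if guess == self.secret:
                return True
            self.failures += 1
            return False

    device = RetryCounter(secret="7291")
    snapshot = copy.deepcopy(device)              # image the chip before guessing

    for code in (f"{n:04d}" for n in range(10_000)):
        if device.failures >= 9:                  # about to hit the limit...
            device = copy.deepcopy(snapshot)      # ...so restore the saved image
        if device.try_code(code):
            print("recovered passcode:", code)
            break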

Had they been first to devise a proposed solution, DriveSavers “absolutely” would have told the FBI because their method doesn’t present extraordinary security risks, Cobb said.

But as to whether it would want to be publicly known as the code cracker in the case, Cobb said that would be “a much bigger, wider conversation” to ponder.

Debate over tech tools’ encryption

Before the San Bernardino terror attack, Syed Rizwan Farook’s iPhone was just one fancy Apple device among hundreds of millions worldwide.

But since the California government worker and his wife shot and killed 14 people on December 2, apparently inspired by extremist group IS, his iPhone 5c has become a key witness – and the government wants Apple to make it talk.

The iPhone, WhatsApp, even social media – government authorities say some of tech fans’ favourite playthings are also some of the most powerful, and problematic, weapons in the arsenals of violent extremists.

Now, in a series of quiet negotiations and noisy legal battles, they’re trying to disarm them, as tech companies and civil liberties groups fight back.

The public debate started with a court order that Apple hack a standard encryption protocol to get at data on Farook’s iPhone, but its repercussions are being felt beyond the tech and law enforcement worlds.

“This is one of the harder questions that we will ever have to deal with,” said Albert Gidari, director of privacy at Stanford Law School’s Centre for Internet and Society.

“How far are we going to go? Where does the government power end to collect all evidence that might exist, and whether it infringes on basic rights? There’s no simple answer,” he told DPA.

It’s not new that terrorists and criminals use mainstream technology to plan and co-ordinate, or that law enforcement breaks into it to catch them. Think of criminals planning a robbery by phone, foiled by police listening in.

But as encryption technology and other next-generation data security move conversations beyond the reach of a conventional wiretap or physical search, law enforcement has demanded the industry provide “back-door” technology to access it too.

At the centre of the fray are otherwise mainstream gadgets and platforms that make private, secure and even anonymous data storage and communication commonplace.

Hundreds of millions of iPhones running iOS 8 or higher are programmed with the same auto-encryption protocol that has stymied investigators in the San Bernardino attack and elsewhere.

US authorities are struggling with how to execute a wiretap order on Facebook-owned WhatsApp’s encrypted messaging platform, used by 1 billion people, the New York Times reported.

In a similar case earlier this month, Brazilian authorities arrested a company executive for not providing WhatsApp data the company said it itself could not access.

Belgium’s interior minister Jan Jambon said in November he believed terrorists were using Sony’s PlayStation 4 gaming network to communicate, Politico reported, although media reports dispute his assertions.

In a world where much of social interaction has moved online, it’s only natural that violent extremism has made the move too.

ISIS, in particular, has integrated its real-world operations with the virtual world, using social media like Twitter and YouTube for recruitment and propaganda and end-to-end encryption for secure communication, authorities say.

Law enforcement authorities and government-aligned terror experts call it the “digital jihad”.

Under pressure from governments, social media providers have cracked down on accounts linked to extremists. Twitter reported it had closed 125,000 ISIS-linked accounts since mid-2015.

Most in the industry have drawn the line at any compromise on encryption, however, saying the benefits of secure data outweigh the costs of its abuse by criminals – leaving authorities wringing their hands.

“Something like San Bernardino” or the November 13 terror attack in Paris “can occur with virtually no indications it was about to happen,” retired general and former Obama anti-terror envoy John Allen warned an audience of techies at the South by Southwest digital conference.

Just a day before, US President Barack Obama had made an unprecedented appearance there, calling for compromise in the showdown between government and tech.

Citing examples of child pornographers, airline security and Swiss bank accounts, Obama said authorities must have the ability to search mobile devices, encrypted or not.

But Gidari called it a “Pandora’s box” too dangerous to open.

Google closing in on target of full encryption

Google is disclosing how much of the traffic to its search engine and other services is being protected from hackers as part of its push to encrypt all online activity.

Encryption shields 77 percent of the requests sent from around the world to Google’s data centers, up from 52 percent at the end of 2013, according to company statistics released Tuesday. The numbers cover all Google services except its YouTube video site, which has more than 1 billion users. Google plans to add YouTube to its encryption breakdown by the end of this year.

Encryption is a security measure that scrambles transmitted information so it’s unintelligible if intercepted by a third party.

Google began emphasizing the need to encrypt people’s online activities after confidential documents leaked in 2013 by former National Security Agency contractor Edward Snowden revealed that the U.S. government had been vacuuming up personal data transferred over the Internet. The surveillance programs exploited gaping holes in unencrypted websites.

While rolling out more encryption on its services, Google has been trying to use the clout of its influential search engine to prod other websites to strengthen their security.

In August 2014, Google revised its secret formula for ranking websites in its search order to boost those that automatically encrypted their services. The change meant websites risked being demoted in Google’s search results and losing visitors if they didn’t embrace encryption.

Google is highlighting its own progress on digital security while the FBI and Apple Inc. are locked in a court battle over access to an encrypted iPhone used by one of the two extremist killers behind the mass shootings in San Bernardino, California, in December.

Google joined several other major technology companies to back Apple in its refusal to honor a court order to unlock the iPhone, arguing that it would require special software that could be exploited by hackers and governments to pry their way into other encrypted devices.

In its encryption crusade, Google is trying to make it nearly impossible for government spies and other snoops to decipher personal information seized while in transit over the Internet.

The statistics show that Google’s Gmail service is completely encrypted as long as the correspondence remains confined to Gmail. Mail exchanges between Gmail and other email services aren’t necessarily encrypted.

Google’s next most frequently encrypted services are maps (83 percent of traffic) and advertising (77 percent, up from just 9 percent at the end of 2013). Encryption frequency falls off for Google’s news service (60 percent) and finance (58 percent).

Government says Apple arguments in encryption case a “diversion”, presents point-by-point rebuttal

As the Apple vs. FBI encryption debate heats up in California, the U.S. government on Thursday fired back at Apple’s oppositions to a court order compelling its assistance in an FBI investigation, and in a new motion discounted a number of arguments related to supposed backdoors, “master keys,” the All Writs Act and more.

In its letter in support of a federal magistrate judge’s original order to compel Apple’s help in unlocking an iPhone used by San Bernardino terror suspect Syed Rizwan Farook, federal prosecutors intimate that the company is playing to the media in an attempt to protect its brand. The document was penned by U.S. Attorney for the Central District of California Eileen M. Decker, Chief of the Cyber and Intellectual Property Crimes Section Tracy L. Wilkison and Chief of the National Security Division Patricia A. Donahue.

“Apple and its amici try to alarm this Court with issues of network security, encryption, back doors, and privacy, invoking larger debates before Congress and in the news media. That is a diversion. Apple desperately wants—desperately needs—this case not to be ‘about one isolated iPhone,'” the letter reads. (Emphasis in original.)

The government argues Farook’s phone may contain actionable intelligence that could help shed light on last year’s terror attack. Investigators need Apple’s help in acquiring said information, if it exists, but instead of providing aid as it has done in the past, the company is waging a war of words both in court and publicly. Prosecutors classify Apple’s statements, including arguments that weakening the security of one iPhone is a slippery slope to a surveillance state, as “not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights.”

One of Apple’s main targets is the All Writs Act, a statute that empowers courts to issue orders when no other judicial tools are available. After being met with resistance to an initial warrant, the FBI leveraged the AWA as a legal foundation to compel Apple’s assistance. If the DOJ is successful in its court action, it could pave the way for broader application of the statute in other investigations, Apple says. Indeed, the FBI is currently asserting the AWA in at least nine other cases involving iOS devices.

In this case, however, the government argues its use of AWA is proper.

As for undue burden, the letter notes Apple grosses hundreds of billions of dollars each year. It would take as few as six employees plucked from Apple’s workforce of approximately 100,000 people as little as two weeks to create a workable solution to the FBI’s problem, the letter says, adding that the company is to blame for being in the position it currently finds itself.

“This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that the government cannot search them, even with a warrant,” according to the government.

A few interesting tidbits were also revealed in the course of dismantling Apple’s opposition, including a technical revelation that strikes at the heart of one of Apple’s key arguments. Apple has maintained that a forced iCloud backup, obtained by connecting Farook’s iPhone to a known Wi-Fi network, might contain information FBI agents are looking for. However, that option was rendered moot after the FBI ordered San Bernardino officials to reset Farook’s Apple ID password.

“The evidence on Farook’s iCloud account suggests that he had already changed his iCloud password himself on October 22, 2015—shortly after the last backup—and that the autobackup feature was disabled. A forced backup of Farook’s iPhone was never going to be successful, and the decision to obtain whatever iCloud evidence was immediately available via the password change was the reasoned decision of experienced FBI agents investigating a deadly terrorist conspiracy,” the government claims.

Finally, the letter takes issue with Apple’s assertions that the instant order violates its First and Fifth Amendment rights. Apple claims computer code should be covered by free speech protections, meaning DOJ requests to write code in an attempt to break into Farook’s iPhone amount to forced speech. Nebulous legal footing aside, Apple’s claims are “particularly weak because it does not involve a person being compelled to speak publicly, but a for-profit corporation being asked to modify commercial software that will be seen only by Apple.”

The idea of narrow investigation is mentioned multiple times. Apple is not being required to create a master key for all iOS devices, government representatives insist, but instead a piece of code applicable to one iPhone. Even if hackers or nefarious agents manage to steal said code, it would only be useful in unlocking Farook’s iPhone 5c, the government attests. This issue is under debate, however, as some experts say the flawed iOS version could be used on other devices. Creating a specialized forensics tool also acts as a proof-of-concept that iOS is vulnerable to attack.

Apple and the DOJ are set to meet in court over the matter in a hearing scheduled for March 22.

No room for compromise in Apple vs FBI iPhone encryption battle

As Apple’s legal battle with the FBI over encryption heads toward a showdown, there appears little hope for a compromise that would placate both sides and avert a divisive court decision.

The FBI is pressing Apple to develop a system that would allow the law enforcement agency to break into a locked iPhone used by one of the San Bernardino attackers, a demand the tech company claims would make all its devices vulnerable.

In an effort to break the deadlock, some US lawmakers are pushing for a panel of experts to study the issue of access to encrypted devices for law enforcement in order to find common ground.

Senator Mark Warner and Representative Mike McCaul on Monday proposed the creation of a 16-member “National Commission on Security and Technology Challenges.”

But digital rights activists warn that the issue provides little middle ground — that once law enforcement gains a “back door,” there would be no way to close it.

“We are concerned that the commission may focus on short-sighted solutions involving mandated or compelled back doors,” said Joseph Hall, chief technologist at the Center for Democracy & Technology.

“Make no mistake, there can be no compromise on back doors. Strong encryption makes anyone who has a cell phone or who uses the Internet far more secure.”

Kevin Bankston of the New America Foundation’s Open Technology Institute expressed similar concerns.

“We’ve already had a wide range of blue ribbon expert panels consider the issue,” he said.

“And all have concluded either that surveillance back doors are a dangerously bad idea, that law enforcement’s concerns about ‘going dark’ are overblown, or both.”

The debate had been simmering for years before the Apple-FBI row.

Last year, a panel led by Massachusetts Institute of Technology scientists warned against “special access” for law enforcement, saying such mechanisms pose “grave security risks” and “imperil innovation.”

Opening up all data

“I’m not sure there is much room for compromise from a technical perspective,” said Stephen Wicker, a Cornell University professor of computer engineering who specializes in mobile computing security.

Opening the door to the FBI effectively makes any data on any mobile device available to the government, he said.

“This is data that was not available anywhere 10 years ago, it’s a function of the smartphone,” Wicker said.

“We as a country have to ask if we want to say that anything outside our personal human memory should be available to the federal government.”

Apple has indicated it is ready for a “conversation” with law enforcement on the matter.

But FBI Director James Comey told a congressional panel that some answers are needed because “there are times when law enforcement saves our lives, rescues our children.”

Asked about the rights envisioned by the framers of the US constitution, he said, “I also doubt that they imagined there would be any place in American life where law enforcement, with lawful authority, could not go.”

A brief filed on behalf of law enforcement associations argued that because of Apple’s new encryption, criminals “have now switched to the new iPhones as the device of choice for their criminal wrongdoing.”

Ed Black, president of the Computer & Communications Industry Association, which includes major technology firms but not Apple, said that although tech firms and law enforcement have had many battles, “there are many areas where we cooperate and where we find middle ground.”

But Black said the tech sector is largely united in this case because the FBI wants Apple to create weaker software or introduce “malware” to be able to crack the locked iPhone.

“On this narrow specific issue of ‘can companies be compelled to create malware,’ I think there may not be an answer,” he said.

‘Going dark’ fears

Law enforcement fears about “going dark” in the face of new technology have been largely exaggerated, Black said.

While access to encrypted apps and smartphones is difficult and traditional wiretaps don’t work on new technology, “there are a lot of other tools for law enforcement,” he said.

“There is more information available in 2016 than in any year since the founding of the country.”

Although law enforcement has growing expectations about using technology to thwart criminals, that type of power is too broad, Black added.

“If they are seeking a level of total surveillance capability, I don’t see a compromise available,” he said.

Wicker said that to give law enforcement access, Congress could in theory mandate that devices use automatic cloud backups that could not be disabled. But that would constitute a dramatic departure from current views about privacy.

“From an individual rights standpoint,” he said, “that would take away control by the user of their personal information.”

Amazon Dropping Fire Encryption Has Nothing to Do With Apple

Today, several reports pointed out that Amazon’s Fire OS 5 does not support device encryption, drawing a connection between the company’s encryption retreat and the current Apple-FBI iPhone unlocking fracas. But Amazon’s decision to remove Fire OS 5’s onboard encryption is not a new development, and it’s not related to the iPhone fight. The real question at hand is why Amazon decided to roll back encryption protection for consumers all on its own.

Introduced last fall, Amazon’s Fire OS 5 featured a refreshing redesign that added several usability features. But Fire OS 5 also took away device encryption support, while still maintaining security features for communication between devices and Amazon’s cloud.

“In the fall when we released Fire OS 5, we removed some enterprise features that we found customers weren’t using,” Amazon spokesperson Robin Handaly told WIRED. “All Fire tablets’ communication with Amazon’s cloud meet our high standards for privacy and security, including appropriate use of encryption.”

We’ve reached out again for clarification as to what “appropriate use” of encryption entails in Amazon’s view.

To be clear, removing encryption protections of any kind from Fire tablets should be seen as a step back for consumers, and for security as a whole.

“Amazon’s decision is backward—it not only moves away from default device encryption, where other manufacturers are headed, but removes all choice by the end user to decide to encrypt it after purchase,” says Nathan White, Senior Legislative Manager at digital rights organization Access Now. “The devices themselves also become more attractive targets for thieves. Users should no longer trust these devices: If you wouldn’t post it to the internet publicly, don’t put it on a Fire Tablet.”

Further, Amazon’s insistence that it maintains a secure connection with the cloud doesn’t ease concerns over the data on the device itself that’s now vulnerable.

“Data encryption at rest and data encryption in motion are two completely different things,” says White. “They shouldn’t conflate two important issues by saying ‘we encrypt in motion, so data at rest doesn’t matter.’”
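White’s distinction is concrete: encryption in motion protects data while it travels between the tablet and a server (typically over TLS), while encryption at rest protects whatever is stored on the device if someone gets hold of the hardware. A minimal sketch of the difference, using standard Python packages, a placeholder URL and a made-up file name, and assuming nothing about Amazon’s own code:

    # Illustrative contrast between encryption in motion and encryption at rest.
    # Requires the 'requests' and 'cryptography' packages; names are made up.
    import requests
    from cryptography.fernet import Fernet

    # In motion: TLS protects the bytes on the wire, but they are plaintext
    # again the moment they are written to the device.
    resp = requests.get("https://example.com/account")

    # At rest: without device encryption, anything saved locally is readable by
    # whoever obtains the hardware; encrypting before writing closes that gap,
    # which is the protection Fire OS 5 no longer provides at the device level.
    key = Fernet.generate_key()                   # would live in secure storage
    with open("account_cache.bin", "wb") as f:
        f.write(Fernet(key).encrypt(resp.content))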

Even without the cloud connection, a device stores all sorts of personal information, from email credentials to credit card numbers to sensitive business information, if you happen to be an enterprise user. In fact, the lack of encryption means corporate customers aren’t able to use certain email clients on Fire tablets any longer.

Amazon’s move is a bad one. But it’s not a retreat in the face of Apple-FBI pressures. For better or worse (mostly worse), it’s been this way for months. As Handaly noted, Fire OS 5 came out last fall, on a suite of new Amazon devices. Amazon message board users have been commenting on, and complaining about, the absence of encryption since at least early January.

So why the sudden focus? Likely because of a tweet that called fresh attention to the change.

People are talking about the lack of encryption today because the OS update is only now hitting older devices, like the fourth-generation Fire HD and Fire HDX 8.9. Despite how neatly the sudden forfeiture of encryption by a tech giant fits the Apple-FBI narrative, this encryption deprecation isn’t related to that battle. Instead, Amazon appears to have given up onboard encryption without any public fight at all.

“This move does not help users. It does not help corporate image. And it does not fit into industry trends,” says Amie Stepanovich, US Policy Manager at Access Now.