Apple’s rivals wary of taking stand on encryption issue, against the FBI

As Apple resists the US government in a high-profile stand-off over privacy, rival device makers are, for now, keeping a low profile.

Most are Asian companies — the region produces eight of every 10 smartphones sold around the world — and operate in a complex legal, political and security landscape.

Only China’s Huawei has publicly backed Apple CEO Tim Cook in his fight to resist demands to unlock an encrypted iPhone belonging to one of those who went on a shooting rampage in San Bernardino, California in December.

“We put a lot of investment into privacy, and security protection is key. It is very important for the consumer,” Richard Yu, chief executive of Huawei’s consumer business group, told reporters at the Mobile World Congress earlier this week.

But Yu stopped short of saying explicitly that Huawei would adopt the same stance. “Some things the government requires from vendors we cannot do,” he said, citing an example of unlocking an encrypted Android device. “These are important things for the consumer, for privacy protection.”

Lenovo Group CEO Yang Yuanqing declined to say whether he backs the Apple position, saying the issue required time and consideration.

“Today it happens to Apple, tomorrow it could happen to Lenovo mobile phones. So we must be very serious to consider. We need to take some time,” Yang told Reuters.

Samsung Electronics Co and Chinese device maker Xiaomi declined to comment, while ZTE Corporation did not respond to requests for comment.

South Korean mobile maker LG Electronics Inc said it takes personal privacy and security very seriously, but declined to say whether it had ever worked with any government to insert so-called “backdoors” into its products or whether it had ever been asked to unlock a smartphone.

“Nobody wants to be seen as a roadblock to an investigation,” said a spokesperson for Micromax, India’s biggest local smartphone maker. “Nobody wants that kind of stigma. We have to take care of both customer security as well as (a) genuine threat to national security.”

Many Asian countries don’t have privacy laws that device makers can fall back on to resist demands from law enforcement authorities.

“As part of the evidence gathering process provided for under the law, law enforcement agencies in Singapore may request information from persons or organizations,” a spokesperson for Singapore’s Ministry of Home Affairs told Reuters.

An official at India’s telecom regulator said authorities there can ask technology companies for private user data, and the same is true in Indonesia, said Ismail Cawidu, spokesman for the country’s Communication and Information Ministry.

Eugene Tan, associate professor of law at Singapore Management University, said he wouldn’t be surprised if technology firms were being asked for access to their devices.

“It’s just that these are not made public. You can imagine for the technology companies, they are also concerned about the publicity — if they are seen to be caving in to law enforcement agencies, there is always a fear that people may not use their products and services,” he said.

Micromax said this was commonplace in India. “I can’t say no to a law enforcement request, and every day there is one,” the company’s spokesperson said. “You have to comply with requests in the larger interest of national security.”

The Apple battle may even spur regulators in some markets to demand that device makers grant them access.

Thailand’s telecoms regulator said it is studying the possibility of having separate agreements with handset makers and social media firms such as Facebook and Naver’s LINE to help extract data from mobile phones.

“There is political pressure” for regulating devices, said Rob Bratby, manager of Olswang Asia, a technology-focused law firm based in Singapore.

He said there was no evidence of any such regulatory interest yet, but it was a matter of time.

Encryption is Not a Threat to Our Safety, But Political Correctness is

The legal battle between Apple Inc. and the US government shows no sign of abating. Tim Cook, CEO of Apple, has indicated that he is willing to fight the US government all the way to the Supreme Court. Apple just upped the ante by announcing that its engineers are working on new iPhone security features that would make the iPhone nearly impossible for the company itself, or for government agencies, to hack into. On the other hand, many government officials and politicians argue that encryption deprives them of opportunities to track the activities of bad guys and stop them from doing harm. Some in Congress are working on a new law to compel technology companies to grant the US government “limited” access by circumventing encryption.

Supporters of either Apple or the US government have written extensively on privacy vs. security issues. But something else has been missing in the current debate. Let’s revisit the San Bernardino terrorist attack. It’s worth remembering that one of the San Bernardino shooters, Tashfeen Malik, didn’t encrypt her radical and anti-American thoughts and ideas on Facebook prior to her visa application; they were posted for anyone to read. But our immigration officials were prevented from reviewing her easily accessible social media postings because the Secretary of Homeland Security feared a civil liberties backlash and bad PR. There is no legal basis for the Secretary’s concern: America has no obligation to grant a visa to any non-US citizen who expresses anti-American sentiment. It was widely reported after the San Bernardino shooting that Tashfeen Malik was responsible for radicalizing her husband, Syed. Had someone at the Department of Homeland Security spent half an hour on a Google search and accordingly denied Tashfeen Malik’s fiancée visa, fourteen lives in San Bernardino could have been saved.

Failing to vet Tashfeen Malik adequately was not an outlier case. The leadership of the Department of Homeland Security has a history of “willingness to compromise the security of citizens for the ideological rigidity of political correctness.” Philip Haney, a former officer who spent 15 years at the Department of Homeland Security (DHS), wrote for The Hill recently that back in 2009, he was ordered by his supervisor at DHS “to delete or modify several hundred records of individuals tied to designated Islamist terror groups like Hamas from the important federal database, the Treasury Enforcement Communications System (TECS).”

Apple and FBI to testify before Congress next week over encryption

Over the past few days, Apple has made it abundantly clear that it will not comply with the FBI’s demand that it write a new piece of software to help bypass built-in iPhone security measures.

On the contrary, Apple has said that it wants the FBI to withdraw all of its demands while adding that the only way to move forward is to form a commission of experts on intelligence, technology, and civil liberties to discuss “the implications for law enforcement, national security, privacy, and personal freedoms.”

In the meantime, Apple has vehemently argued that Congress, not the courts, should be tasked with determining the fate of the shooter’s iPhone. Come next Tuesday, Apple will finally be able to plead its case directly in front of our country’s lawmakers.

Earlier today, the House Judiciary Committee announced that it will be holding a congressional hearing on encryption on Tuesday, March 1. The hearing itself is called, “The Encryption Tightrope: Balancing Americans’ Security and Privacy.”

Slated to testify on the first panel is FBI Director James Comey who, you might recall, recently penned a blog post arguing that the current debate isn’t about the implications of encryption, but rather about “the victims and justice.”

On the second panel, Apple’s top lawyer, Bruce Sewell, will appear and present Apple’s case. Appearing alongside him will be Susan Landau, a cybersecurity expert, and New York District Attorney Cyrus R. Vance, Jr.

A statement from the House Judiciary Committee on the upcoming hearing reads as follows:

This should undoubtedly make for a lively hearing.

Speaking to the seriousness with which Apple views this debate, Tim Cook yesterday said that helping the FBI would be tantamount to creating the “software equivalent of cancer.”

Apple CEO defends position in encryption dispute with feds

Apple CEO Tim Cook said in an interview Wednesday it was a tough decision to resist a court order directing the tech giant to override security features on the iPhone used by one of the San Bernardino gunmen who killed 14 people in a December terror attack.

However, Cook reiterated to ABC News in his first interview since the controversy erupted last week that if his company complied with the FBI’s demand to unlock Syed Rizwan Farook’s encrypted phone it would be “bad for America.”

“Some things are hard and some things are right, and some things are both. This is one of those things,” Cook said. The interview came as both sides in the dispute are courting public support, through interviews and published statements, while also mustering legal arguments in the case.

Federal authorities have insisted they’re only asking for narrow assistance in bypassing some security features on the iPhone, which they believe contains information related to the mass murders. Apple argues that doing so would make other iPhones more susceptible to hacking by authorities or criminals in the future.

The Apple chief expressed sympathy for the shooting victims’ families, and said his company provided engineers and technical advice to authorities investigating the case. But he said authorities are now asking the company “to write a piece of software that we view as sort of the equivalent of cancer.”

The software could “expose people to incredible vulnerabilities,” Cook added, arguing that smartphones contain private information about users and even their families.

“This would be bad for America,” he said. “It would also set a precedent that I believe many people in America would be offended by.”

Meanwhile, Attorney General Loretta Lynch defended the FBI’s push to access the locked phone Wednesday, saying judges at all levels have held that such companies “must assist if it is reasonably within their power to do so,” and suggesting Congress does not need to get involved, as Apple wants.

Lynch used testimony Wednesday before a House appropriations subcommittee to lay out the Justice Department’s position that courts have already found that companies must assist in opening devices.

“If the government needs the assistance of third parties to ensure that the search is actually conducted, judges all over the country and on the Supreme Court have said that those parties must assist if it is reasonably within their power to do so,” she said, without mentioning Apple by name. “And that is what we have been asking, and we owe it to the victims and to the public whose safety we must protect to ensure that we have done everything under the law to fully investigate terrorist attacks on American soil.”

Apple also is expected to argue that the Obama administration’s demand that it help hack into an iPhone in the federal investigation of the San Bernardino attack is improper under an 18th-century law, the 1789 All Writs Act, which has been used to compel companies to provide assistance to law enforcement.

Magistrate Judge Sheri Pym in California ordered Apple last week to create specialized software to help the FBI hack into a locked, county-issued iPhone used by Farook.

Why Canada isn’t having a policy debate over encryption

The legal saga between Apple and the FBI has thrust encryption into the government’s policy spotlight again – but only if you live in the United States. In Canada, you could be excused for not knowing such a debate exists.

Ever since FBI director James Comey characterized the rising tide of encrypted data as “going dark” in an October, 2014 speech, American civil liberties groups, cryptographers, private companies and politicians have argued ceaselessly about encryption’s merits and the dangers of so-called backdoors.

While most acknowledge that encryption keeps vast swaths of Internet communication and services secure, there have nonetheless been calls for legislation, “golden keys” and the formation of encryption committees in response to increasingly vocal arguments that encryption is helping criminals and terrorists operate beyond the law’s reach.

Things culminated last week with a court order, sought by the FBI, that Apple Inc. modify its software to make it easier for law enforcement to break the iPhone’s security protections – modifications that have been characterized as a backdoor for law enforcement, or criminals, to use again and again.

In Canada, however, policy discussions involving encryption and, more broadly, police powers in the digital realm – such as cellphone tracking devices and the use of hacking tools – have been “functionally non-existent,” according to Citizen Lab researcher Christopher Parsons.

“We haven’t had the kind of debate and back and forth and public positions taken that you see in the United States, you see in the United Kingdom. We just don’t do it here,” Mr. Parsons said.

Some of the reasons are familiar. There is, for example, a comparatively smaller policy community in Canada that focuses on these issues than there is in the U.S., and a smaller amount of case law – not to mention the fact that previous governments have shown more interest in expanding police powers, rather than curtailing or even detailing them.

And if past U.S. cases are any indication, the government will just as easily benefit by staying out of the debate and piggybacking on the outcome of the FBI’s case.

“They can dodge the debate and benefit from it without having to engage in it,” said Tamir Israel, a staff lawyer with the Canadian Internet Policy and Public Interest Clinic. “And then the other side to that is they often will find quieter ways to get comparable results where they can’t directly piggyback.”

By way of example, Mr. Israel pointed to the Solicitor General’s Enforcement Standards (SGES), which outline 23 technical surveillance standards that must be followed as a condition of obtaining a wireless spectrum licence in Canada. After the U.S. passed lawful surveillance legislation called the Communications Assistance for Law Enforcement Act in the 1990s, Canada used the SGES to quietly introduce similar standards.

Although the standards were introduced in the mid-1990s and updated again in 2008, details were not made public until The Globe and Mail obtained past and current versions of the documents in 2013.

Mr. Israel pointed to a wider problem preventing a successful encryption debate in Canada: a lack of transparency surrounding the government’s position and policies. He raised cellphone tracking technology called Stingrays, or IMSI catchers, as an example. “I personally find it very hard to believe that no law enforcement agencies in Canada are using these. But we can’t even get the debate going, because we can’t get past that first step where any of them admit that they’re using them.”

The RCMP would not comment on Apple’s dispute with the FBI but said in a statement: “International police agencies are all in agreement that some ability to access evidence when judicial authorization is granted is required, recognizing that secure data and communications enables commerce and social interactions in today’s reality. These are complex challenges which the RCMP continues to study.”

The statement continued: “The RCMP encourages public discourse with Canadians as public policy continues to take shape on the issue of encryption.”

The Office of the Privacy Commissioner of Canada said in an e-mail that it was not aware of any government agencies that have proposed backdoors in Canadian companies or Internet service providers, and that it is following encryption discussions “with interest.”

When reached via e-mail, Liberal MP Robert Oliphant, who chairs the standing committee on public safety and national security, wrote that, “while encryption and backdoors are of great concern to a number of people, they have not yet surfaced as issues for our committee in its early days.”

However, he added, the committee is still “sifting through all the important issues of safety and security and will be setting our work plan shortly.”

Public Safety Canada said in a statement that it is “monitoring the ongoing debate in the U.S. and other countries on the issue of government access to encrypted data” and that “no special events related to encryption” are currently planned.

NDP MP and committee vice-chair Brian Masse, echoing Mr. Oliphant’s statement, added that any proposed legislative changes involving encryption or backdoors should be handled democratically and involve both the Privacy Commissioner and Parliament.

Meanwhile, neither the chair nor vice-chairs of the standing committee on industry, science and technology responded to a request for comment.

A small comfort, Citizen Lab’s Mr. Parsons argued, is that Canadian politicians have shown themselves to be more level-headed and avoided the sky-is-falling rhetoric of their counterparts in the U.S., where Senator Dianne Feinstein, vice-chair of the Senate select committee on intelligence, stated earlier this month that “an Internet connection and an encrypted message application” is all Islamic State militants need to carry out an attack.

If this issue is going to be given some weight, Mr. Parsons suggested, “committee meetings that very seriously look into this while there isn’t a terror moment, it’s the ideal way of going.”

San Bernardino victims to oppose Apple on iPhone encryption

Some victims of the San Bernardino attack will file a legal brief in support of the U.S. government’s attempt to force Apple Inc to unlock the encrypted iPhone belonging to one of the shooters, a lawyer representing the victims said on Sunday.

Stephen Larson, a former federal judge who is now in private practice, told Reuters that the victims he represents have an interest in the information which goes beyond the Justice Department’s criminal investigation.

“They were targeted by terrorists, and they need to know why, how this could happen,” Larson said.

Larson said he was contacted a week ago by the Justice Department and local prosecutors about representing the victims, prior to the dispute becoming public. He said he will file an amicus brief in court by early March.

A Justice Department spokesman declined to comment on the matter on Sunday.

Larson declined to say how many victims he represents. Fourteen people died and 22 others were wounded in the shooting attack by a married couple who were inspired by Islamic State militants and died in a gun battle with police.

Entry into the fray by victims gives the federal government a powerful ally in its fight against Apple, which has cast itself as trying to protect public privacy from overreach by the federal government.

An Apple spokesman declined to comment. In a letter to customers last week, Tim Cook, the company’s chief executive, said: “We mourn the loss of life and want justice for all those whose lives were affected,” saying that the company has “worked hard to support the government’s efforts to solve this horrible crime.”

Federal Bureau of Investigation Director James Comey said in a letter released on Sunday night that the agency’s request wasn’t about setting legal precedent, but rather seeking justice for the victims and investigating other possible threats.

“Fourteen people were slaughtered and many more had their lives and bodies ruined. We owe them a thorough and professional investigation under law. That’s what this is,” Comey wrote.

The FBI is seeking the tech company’s help to access shooter Syed Rizwan Farook’s phone by disabling some of its passcode protections. The company so far has pushed back, arguing that such a move would set a dangerous precedent and threaten customer security.

The clash between Apple and the Justice Department has driven straight to the heart of a long-running debate over how much law enforcement and intelligence officials should be able to monitor digital communications.

The Justice Department won an order in a Riverside, California federal court on Tuesday against Apple, without the company present in court. Apple is scheduled to file its first legal arguments on Friday, and U.S. Magistrate Judge Sheri Pym, who served as a federal prosecutor before being appointed to the bench, has set a hearing on the issue for next month.

Larson once presided over cases in Riverside, and Pym argued cases in Larson’s courtroom several times as a prosecutor while Larson was a judge, he said. Larson returned to private practice in 2009, saying at the time that a judge’s salary was not enough to provide for his seven children.

He said he is representing the San Bernardino victims for free.

What Tim Cook doesn’t want to admit about iPhones and encryption

When Hillary Clinton called for a “Manhattan-like project” to find a way for the government to spy on criminals without undermining the security of everyone else’s communications, the technology world responded with mockery.

“Also we can create magical ponies who burp ice cream while we’re at it,” snarked prominent Silicon Valley investor Marc Andreessen. Clinton’s idea “makes no sense,” added Techdirt’s Mike Masnick, because “backdooring encryption means that everyone is more exposed to everyone, including malicious hackers.”

It’s an argument that’s been echoed by Apple CEO Tim Cook, who is currently waging a legal battle with the FBI over its request to unlock the iPhone of San Bernardino terrorism suspect Syed Rizwan Farook. “You can’t have a backdoor that’s only for the good guys,” Cook said in November.

There’s just one problem: This isn’t actually true, and the fight over Farook’s iPhone proves it. Apple has tacitly admitted that it can modify the software on Farook’s iPhone to give the FBI access without damaging the security of anyone else’s iPhone.

Claiming that secure back doors are technically impossible is politically convenient. It allows big technology companies like Apple to say that they’d love to help law enforcement but don’t know how to do it without also helping criminals and hackers.

But now, faced with a case where Apple clearly can help law enforcement, Cook is in the awkward position of arguing that it shouldn’t be required to.

Apple isn’t actually worried about the privacy of a dead terrorism suspect. Cook is worried about the legal precedent — not only being forced to help crack more iPhones in the future, but conceivably being forced to build other hacking tools as well.

But by taking a hard line in a case where Apple really could help law enforcement in an important terrorism case — and where doing so wouldn’t directly endanger the security of anyone else’s iPhone — Apple risks giving the impression that tech companies’ objections aren’t being made entirely in good faith.

The San Bernardino case shows secure back doors are possible

Technologists aren’t lying when they say secure back doors are impossible. They’re just talking about something much narrower than what the term means to a layperson. Specifically, their claim is that it’s impossible to design encryption algorithms that scramble data in a way that the recipient and the government — but no one else — can read.

That’s been conventional wisdom ever since 1994, when a researcher named Matt Blaze demonstrated that a government-backed proposal for a back-doored encryption chip had fatal security flaws. In the two decades since, technologists have become convinced that this is something close to a general principle: It’s very difficult to design encryption algorithms that are vulnerable to eavesdropping by one party but provably secure against everyone else. The strongest encryption algorithms we know about are all designed to be secure against everyone.

But the fact that we don’t know how to make an encryption algorithm that can be compromised only by law enforcement doesn’t imply that we don’t know how to make a technology product that can be unlocked only by law enforcement. In fact, the iPhone 5C that Apple and the FBI are fighting about this week is a perfect example of such a technology product.
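
To make that distinction concrete, here is a minimal Python sketch of the general pattern: the cipher itself has no backdoor, the short passcode is stretched together with a per-device secret, and the guess limit is enforced in software rather than in the mathematics. Everything in it (the `cryptography` package’s Fernet recipe, PBKDF2, the ten-attempt limit, the function names) is an illustrative assumption, not Apple’s actual design; the point is only that removing the software limit does not require weakening the encryption algorithm.

```python
# Illustrative sketch only: standard, backdoor-free encryption plus a
# software-enforced retry limit. Names and numbers are hypothetical.
import base64
import hashlib
import secrets

from cryptography.fernet import Fernet, InvalidToken

DEVICE_SECRET = secrets.token_bytes(32)  # stands in for a per-device hardware key
MAX_ATTEMPTS = 10                        # stands in for the auto-erase threshold


def derive_key(passcode: str) -> bytes:
    """Stretch the short passcode together with the device secret."""
    raw = hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SECRET, 100_000)
    return base64.urlsafe_b64encode(raw)


def encrypt_user_data(passcode: str, plaintext: bytes) -> bytes:
    return Fernet(derive_key(passcode)).encrypt(plaintext)


def try_unlock(ciphertext: bytes, guesses):
    """The retry limit lives here, in replaceable software, not in the cipher."""
    for attempt, guess in enumerate(guesses, start=1):
        if attempt > MAX_ATTEMPTS:
            print("Too many wrong guesses: auto-erase triggered (simulated).")
            return None
        try:
            return Fernet(derive_key(guess)).decrypt(ciphertext)
        except InvalidToken:
            continue  # wrong passcode, keep counting attempts
    return None


if __name__ == "__main__":
    blob = encrypt_user_data("7391", b"contacts, photos, messages")
    print(try_unlock(blob, ["0000", "1234", "7391"]))
```

In this toy model, shipping a modified `try_unlock` with the attempt check removed would not touch the cipher at all, which is why a request of this kind can be technically feasible even though a mathematical backdoor is not.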

You can read about how the hack the FBI has sought would work in my previous coverage, or this even more detailed technical analysis. But the bottom line is that the technology the FBI is requesting — and that Apple has tacitly conceded it could build if forced to do so — accomplishes what many back door opponents have insisted is impossible.

Without Apple’s help, Farook’s iPhone is secure against all known attacks. With Apple’s help, the FBI will be able to crack the encryption on Farook’s iPhone. And helping the FBI crack Farook’s phone won’t help the FBI or anyone else unlock anyone else’s iPhone.

It appears, however, that more recent iPhones are not vulnerable to the same kind of attack. (Update: Apple has told Techcrunch that newer iPhones are also vulnerable.) If Farook had had an iPhone 6S instead of an iPhone 5C, it’s likely (though only Apple knows for sure) that Apple could have truthfully said it had no way to help the FBI extract the data.

That worries law enforcement officials like FBI Director James Comey, who has called on technology companies to work with the government to ensure that encrypted data can always be unscrambled. Comey hasn’t proposed a specific piece of legislation, but he is effectively calling on Apple to stop producing technology products like the iPhone 6S that cannot be hacked even with Apple’s help.

The strongest case against back doors is repressive regimes overseas

If you have a lot of faith in the US legal system (and you’re not too concerned about the NSA’s creative interpretations of surveillance law), Comey’s demand might seem reasonable. Law enforcement agencies have long had the ability to get copies of almost all types of private communication and data if they first get a warrant. There would be a number of practical problems with legally prohibiting technology products without back doors, but you might wonder why technology companies don’t just voluntarily design their products to comply with lawful warrants.

But things look different from a global perspective. Because if you care about human rights, then you should want to make sure that ordinary citizens in authoritarian countries like China, Cuba, and Saudi Arabia also have access to secure encryption.

And if technology companies provided the US government with backdoor access to smartphones — either voluntarily or under legal compulsion — it would be very difficult for them to refuse to extend the same courtesy to other, more authoritarian regimes. In practice, providing access to the US government also means providing access to the Chinese government.

And this is probably Apple’s strongest argument in its current fight with the FBI. If the US courts refuse to grant the FBI’s request, Apple might be able to tell China that it simply doesn’t have the software required to help hack into the iPhone 5Cs of Chinese suspects. But if Apple were to create the software for the FBI, the Chinese government would likely put immense pressure on Apple to extend it the same courtesy.

Google CEO Pichai Lends Apple Support on Encryption

Google Chief Executive Sundar Pichai lent support to Apple Inc.’s pushback against a federal order to help law enforcement break into the locked iPhone of an alleged shooter in the San Bernardino, Calif., attacks.

Mr. Pichai wrote on Twitter on Wednesday that “forcing companies to enable hacking could compromise users’ privacy.”

A federal judge Tuesday ordered Apple to enable investigators to bypass the passcode of the iPhone once used by alleged shooter Syed Rizwan Farook. Apple CEO Tim Cook wrote on Apple’s website that such a move would create “a backdoor” around security measures hackers could eventually use to steal iPhone users’ data.

On Twitter, Mr. Pichai called Mr. Cook’s letter an “important post.” He said that while Alphabet Inc.’s Google provides user data to law enforcement under court orders, “that’s wholly different than requiring companies to enable hacking of customer devices and data. Could be a troubling precedent.”

Google, like Apple, has been locked in an intensifying battle with U.S. authorities over the companies’ smartphone encryption software. The firms say that the encryption is crucial to protecting users’ privacy, and keeping their trust. Law enforcement officials say such software hinders criminal investigations, including into the San Bernardino attacks.

Here’s why the FBI forcing Apple to break into an iPhone is a big deal

When U.S. Magistrate Sheri Pym ruled that Apple must help the FBI break into an iPhone belonging to one of the killers in the San Bernardino, Calif., shootings, the tech world shuddered.

Why? The battle of encryption “backdoors” has been longstanding in Silicon Valley, where a company’s success could be made or broken based on its ability to protect customer data.

The issue came into the spotlight after Edward Snowden disclosed the extent to which technology and phone companies were letting the U.S. federal government spy on data being transmitted through their networks.

Since Snowden’s revelations, Facebook, Apple and Twitter have each said they will not create such backdoors.

So here’s the “backdoor” the FBI wants: Right now, iPhone users have the option to set a security feature that only allows a certain number of tries to guess the correct passcode to unlock the phone before all the data on the iPhone is deleted. It’s a security measure Apple put in place to keep important data out of the wrong hands.

Federal prosecutors looking for more information behind the San Bernardino shootings don’t know the phone’s passcode. If they guess incorrectly too many times, the data they hope to find will be deleted.

That’s why the FBI wants Apple to disable the security feature. With that protection out of the way, agents could try as many passcode combinations as it takes.
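
To see why that matters, consider the arithmetic. The sketch below is purely illustrative (the guess rate and helper names are assumptions, not figures from the court order): it simply enumerates every four-digit numeric passcode, which is trivial once the retry limit is gone.

```python
# Back-of-the-envelope sketch: a short numeric passcode falls quickly to
# brute force once the retry limit and artificial delays are removed.
from itertools import product

GUESSES_PER_SECOND = 12.5  # assumed rate for illustration, not a measured iPhone figure


def brute_force(passcode_length: int, is_correct):
    """Try every numeric passcode of the given length until one unlocks."""
    for digits in product("0123456789", repeat=passcode_length):
        guess = "".join(digits)
        if is_correct(guess):
            return guess
    return None


if __name__ == "__main__":
    secret = "7391"  # hypothetical passcode
    found = brute_force(4, lambda guess: guess == secret)
    worst_case_minutes = 10 ** 4 / GUESSES_PER_SECOND / 60
    print(f"Recovered passcode: {found}")
    print(f"Worst case at {GUESSES_PER_SECOND} guesses/s: about {worst_case_minutes:.0f} minutes")
```

With the auto-erase feature on, the same attacker gets only a handful of tries before the data is gone, which is the whole point of the protection the FBI wants disabled.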

Kurt Opsahl, general counsel for the Electronic Frontier Foundation, a San Francisco-based digital rights non-profit, explained that this “backdoor” means Apple will have to write brand new code that will compromise key features of the phone’s security. Apple has five business days to respond to the request.

What does Apple have to say about this? The company hasn’t commented yet today, but back in December, Apple CEO Tim Cook defended its use of encryption on mobile devices in a wide-ranging interview with 60 Minutes, saying users should not have to trade privacy for national security. In the interview, Cook stood by the company’s refusal to provide access to users’ encrypted texts and messages.

“There’s likely health information, there’s financial information,” Cook said, describing a user’s iPhone. “There are intimate conversations with your family, or your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it. Why is that? It’s because if there’s a way to get in, then somebody will find the way in.”

Cook says Apple cooperates with law enforcement requests, but can’t access encrypted information on users’ smartphones. According to a page on Apple’s website detailing government requests, encrypted data is tied to the device’s passcode.

Cook also dismissed the idea that iPhone users should swap privacy for security. “We’re America. We should have both.”

What does this mean for the next time the government wants access? The order doesn’t create a precedent in the sense that other courts will be compelled to follow it, but it will give the government more ammunition.

What do digital rights experts have to say? There are two things that make this order very dangerous, Opsahl said. The first is the question it raises about who can make this type of demand. If the U.S. government can force Apple to do this, why can’t the Chinese or Russian governments?

The second is that while the government is requesting a program to allow it to break into this one, specific iPhone, once the program is created it will essentially be a master key. It would be possible for the government to take this key, modify it and use it on other phones. That asks the public to risk a great deal: to trust that the government will hold this power without misusing it, he said.

And the lawmakers? Well, they are torn. A key House Democrat, Rep. Adam Schiff, D-Calif., says Congress shouldn’t force tech companies to build encryption backdoors. Congress is struggling with how to handle the complex issue.

On the other side of things, Senate Intelligence Committee Chairman Richard Burr, R-N.C., and Vice Chair Dianne Feinstein, D-Calif., say they want to require tech companies to provide a backdoor into encrypted communication when law enforcement officials obtain a court order to investigate a specific person.

What now? This could push the tech companies to give users access to unbreakable encryption. To some extent, it’s already happening. Companies like Apple and Google — responding to consumer demands for privacy — have developed smartphones and other devices with encryption that is so strong that even the companies can’t break it.

Encryption May Hurt Surveillance, but Internet Of Things Could Open New Doors

Tech companies and privacy advocates have been in a stalemate with government officials over how encrypted communication affects the ability of federal investigators to monitor terrorists and other criminals. A new study by Harvard’s Berkman Center for Internet and Society convened experts from all sides to put the issue in context.

The report concluded that information from some apps and devices like smartphones may be harder for government investigators to intercept because of stronger encryption. But, it said, we are connecting so many more things to the Internet (light bulbs, door locks, watches, toasters) that they could create new surveillance channels.

The encryption debate has reheated recently following the attacks in Paris and to some extent San Bernardino, Calif., with CIA and FBI officials warning about their investigation channels “going dark” because of the stronger encryption placed on communications tools like WhatsApp or FaceTime.

(The distinction is this: With things like emails, Web searches, photos or social network posts, information typically gets encrypted on your phone or laptop and then decrypted and stored on a big corporate data server, where law enforcement officials have the technical and legal ability to get access to the content, for instance, with a subpoena. But with messages that are encrypted end-to-end, data gets encrypted on one device and only gets decrypted when it reaches the recipient’s device, making it inaccessible even with a subpoena.)
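
A toy sketch of that distinction, in Python using the `cryptography` package’s Fernet recipe, is shown below. It is a deliberate simplification: real end-to-end messengers negotiate keys with public-key cryptography rather than a pre-shared symmetric key, and all of the variable names here are hypothetical.

```python
# Toy illustration of provider-held keys vs. end-to-end encryption.
from cryptography.fernet import Fernet

# Provider-encrypted model: the service holds the key alongside the data,
# so it can decrypt stored content in response to legal process.
provider_key = Fernet.generate_key()              # lives on the provider's servers
stored_blob = Fernet(provider_key).encrypt(b"meet at noon")
print(Fernet(provider_key).decrypt(stored_blob))  # provider can comply with a subpoena

# End-to-end model: only the sender and recipient ever hold the key.
shared_key = Fernet.generate_key()                # known only to the two endpoints
in_transit = Fernet(shared_key).encrypt(b"meet at noon")
# The provider relays `in_transit` but never sees `shared_key`, so even with
# a subpoena it can hand over only ciphertext it cannot read.
print(Fernet(shared_key).decrypt(in_transit))     # succeeds only on the endpoints
```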

The agencies have asked for “back doors” into these technologies, though the Obama administration cooled off its push for related legislation late last year over concerns that such security loopholes would also attract hackers and other governments.

But the Harvard report (which was funded by the Hewlett Foundation) argues that “going dark” is a faulty metaphor for the surveillance of the future, thanks to the raft of new technologies that are and likely will remain unencrypted — all the Web-connected home appliances and consumer electronics that sometimes get dubbed the Internet of Things.

Some of the ways the data used to be accessed will undoubtedly become unavailable to investigators, says Jonathan Zittrain, a Harvard professor who was one of the authors. “But the overall landscape is getting brighter and brighter as there are so many more paths by which to achieve surveillance,” he says.

“If you have data flowing or at rest somewhere and it’s held by somebody that can be under the jurisdiction of not just one but multiple governments, those governments at some point or another are going to get around to asking for the data,” he says.

The study team is notable for including technical experts and civil liberties advocates alongside current and former National Security Agency, Defense Department and Justice Department officials. Another chief author was Matthew Olsen, former director of the National Counterterrorism Center and NSA general counsel.

Though not all 14 core members had to agree to every word of the report, they had to approve of the thrust of its findings — with the exception of current NSA officials John DeLong and Anne Neuberger, whose jobs prevented them from signing onto the report (and Zittrain says nothing should be inferred about their views).

The results of the report are a bit ironic: It tries to close one can of worms (the debate over encryption hurting surveillance) but opens another one (the concerns about privacy in the future of Internet-connected everything).

“When you look at it over the long term,” says Zittrain, “with the breadth of ways in which stuff that used to be ephemeral is now becoming digital and stored, the opportunities for surveillance are quite bright, possibly even worryingly so.”