Take a stand against the Obama/FBI anti-encryption charm offensive

It has been frustrating to watch as the horrific San Bernardino terrorist killing spree has been used as a cover by the FBI to achieve the anti-encryption goals they’ve been working towards for years. Much of that frustration stems from the fact that the American media has so poorly reported the facts in this case.

The real issue in play is that the FBI wants backdoor access to any and all forms of encryption and is willing to demonize Apple in order to establish an initial precedent it can then use against all other software and hardware makers, all of whom are smaller and are far less likely to even attempt to stand up against government overreach.

However, the media has constantly echoed the FBI’s blatantly false claims that it “does not really want a backdoor,” that it only cares about “just this one” phone, that all that’s really involved is “Apple’s failure to cooperate in unlocking” this single device, and that there “isn’t really any precedent that would be set.” Every thread of that tapestry is completely untrue, and even the government has now admitted this repeatedly.

Representative democracy doesn’t work if the population only gets worthless information from the fourth estate.

However, in case after case journalists have penned entertainment posing as news, including a bizarre fantasy written up by Mark Sullivan for Fast Company detailing “How Apple Could Be Punished For Defying FBI.”

A purportedly respectable polling company asked the population whether Apple should cooperate with the police in a terrorism case. But that wasn’t the issue at hand. The real issue is whether the U.S. Federal Government should act to make real encryption illegal by mandating that companies break their own security so the FBI doesn’t have to on its own.

The Government’s Anti-Encryption Charm Offensive

Last Friday, U.S. Attorney General Loretta Lynch made an appearance on The Late Show with Stephen Colbert to again insist that this is a limited case of a single device that has nothing to do with a backdoor, and that it is really just a matter of the county that owns the phone asking Apple for assistance, as in a routine customer service call.

Over the weekend, President Obama appeared at SXSW to gain support for the FBI’s case, stating outright that citizens’ expectation that encryption should actually work is “incorrect” and “absolutist.”

He actually stated that, “If your argument is ‘strong encryption no matter what, and we can and should in fact create black boxes,’ that I think does not strike the kind of balance we have lived with for 200, 300 years. And it’s fetishizing our phone above every other value, and that can’t be the right answer.”

That’s simply technically incorrect. There’s no “balance” possible in the debate on encryption. Either we have access to real encryption or we don’t. It very much is an issue of absolutes. Real encryption means that the data is absolutely scrambled, the same way that a paper shredder absolutely obliterates documents. If you have a route to defeat encryption on a device or between two devices, it’s a backdoor, whether the government wants to play a deceptive word game or not.
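The all-or-nothing nature of the shredder analogy can be shown with a toy one-time-pad sketch in Python. This is illustrative only: real systems use vetted ciphers such as AES, and the message and keys here are invented for the example. The point it demonstrates is that with the right key the plaintext comes back exactly, and with any other key the output is just as scrambled as the ciphertext:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad style XOR: the same operation encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at the safehouse"            # hypothetical plaintext
key = secrets.token_bytes(len(message))       # truly random, message-length key
ciphertext = xor_cipher(message, key)

# With the key, decryption is exact.
assert xor_cipher(ciphertext, key) == message

# Without the key, there is no partial recovery: a wrong key yields
# unrelated bytes, and every plaintext of this length is equally plausible.
wrong_key = secrets.token_bytes(len(message))
assert xor_cipher(ciphertext, wrong_key) != message
```

There is no dial on this construction that yields “mostly scrambled” data; either the key is present or the data is noise, which is the sense in which encryption is an absolute.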

If the government obtains a warrant, that means it has the legal authority to seize evidence. It does not mean that the agencies involved have unbridled rights to conscript unrelated parties into working on their behalf to decipher, translate or recreate any bits of data that are discovered.

If companies like Apple are forced to build security backdoors by the government to get around encryption, then those backdoors will also be available to criminals, to terrorists, to repressive regimes and to our own government agencies that have an atrocious record of protecting the security of data they collect, and in deciding what information they should be collecting in the first place.

For every example of a terrorist with collaborator contacts on his phone, or a criminal with photos of their crimes on their phone, or a child pornographer with smut on their computer, there are thousands of individuals who can be hurt by terrorists plotting an attack using backdoors to cover their tracks, or criminals stalking their victims’ actions and locations via backdoor exploits of their devices’ security, or criminal gangs distributing illicit content that steps around security barriers the same way that the police hope to step around encryption on devices.

Security is an absolutist position. You either have it or you don’t.

Obama was right in one respect. He noted that in a world with “strong, perfect encryption,” it could be that “what you’ll find is that after something really bad happens the politics of this will swing and it will become sloppy and rushed. And it will go through Congress in ways that have not been thought through. And then you really will have a danger to our civil liberties because [we will have] disengaged or taken a position that is not sustainable.”

However, the real answer to avoiding “sloppy, rushed” panic-driven legislation is instead to establish clear rights for citizens and their companies to create and use secure tools, even if there is some fear that secure devices may be used in a way that prevents police from gaining access to some of the evidence they might like to access in certain cases.

The United States makes no effort to abridge the use of weapons like those used in San Bernardino to actually commit the atrocity. It should similarly not insist that American encryption should only work with a backdoor open on the side, giving police full access to any data they might want.

It’s not just a bad idea, it’s one that will accomplish nothing because anyone nefarious who wants to hide their data from the police can simply use non-American encryption products that the FBI, the president and the U.S. Congress have no ability to weaken, regardless of how much easier it would make things for police.

New FBI strategy wins back lost ground in encryption fight

By July 2015, FBI Director Jim Comey knew he was losing the battle against sophisticated technologies that allowed criminals to communicate without fear of government surveillance.

In back-to-back congressional hearings that month, Comey struggled to make the case that terrorists and crooks were routinely using such encryption systems to evade the authorities. He conceded that he had no real answer to the problem and agreed that all suggested remedies had major drawbacks. Pressed for specifics, he couldn’t even say how often bureau investigations had been stymied by what he called the “going dark” problem.

“We’re going to try and do that for you, but I’m not optimistic we’re going to be able to get you a great data set,” he told lawmakers.

This week, Comey was back before Congress with a retooled sales pitch. Gone were the vague allusions to ill-defined problems. In their place: a powerful tale of the FBI’s need to learn what is on an encrypted iPhone used by one of the terrorists who killed 14 people in California. “Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t,” Comey wrote shortly before testifying. “But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.”

The tactical shift has won Comey tangible gains. After more than a year of congressional inaction, two prominent lawmakers, Sen. Mark Warner (D-Va.) and House Homeland Security Chairman Michael McCaul (R-Texas), have proposed a federal commission that could lead to encryption legislation. Several key lawmakers, who previously hadn’t chosen sides over encryption, such as Rep. Jim Langevin (D-RI), are siding with the administration in its legal battle with Apple. Likewise, several former national security officials — such as former National Security Agency chief Gen. Michael Hayden and former Director of National Intelligence Mike McConnell — who lined up with privacy advocates in the past have returned to the government side in this case.

“The public debate was not going the FBI’s way and it appears there’s been a deliberate shift in strategy,” said Mike German, a former FBI special agent. “They realized…that the most politically tenable argument was going to be ‘we need access when we have a warrant and in a serious criminal case. All the better if it’s a terrorism case.’”

The catalyst for change has been a high-stakes legal fight in a central California courtroom where Apple seeks to overturn a judge’s order to write new software to help the FBI circumvent an iPhone passcode. Other technology companies such as Microsoft, Google, Facebook and Twitter this week rallied to Apple’s side. The Justice Department, meanwhile, has drawn supporting legal briefs from law enforcement associations as well as families of the San Bernardino victims.

Comey’s evolution may have been foreshadowed last summer. In an August email, Robert Litt, the intelligence community’s top lawyer, wrote colleagues that the mood on Capitol Hill “could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement,” according to The Washington Post.

The Dec. 2 San Bernardino attack, coming less than three weeks after a coordinated series of Islamic State shootings and bombings killed at least 130 people in Paris, reignited law enforcement concern about terrorists’ ability to shield their plotting via encryption. The San Bernardino killers, Syed Farook and his wife Tashfeen Malik, destroyed two cellphones before dying in a gun battle with police. Investigators discovered the iPhone at issue in the courtroom fight inside the Farook family’s black Lexus sedan.

To be sure, Comey’s new strategy thus far has paid only limited dividends. The Warner-McCaul commission, if it is ever formed, may or may not change U.S. encryption policy. Renewed support from former officials, such as Hayden and McConnell, extends only to the San Bernardino case.

Indeed, the FBI director’s hopes for an enduring solution to the “going dark” problem remain aspirational. The White House last fall abandoned plans to seek legislation mandating a technological fix for authorities’ encryption headaches. And since then, the Obama administration has confined itself to jawboning Silicon Valley.

But in choosing to make a fight over the iPhone used by one of the San Bernardino terrorists, Comey has selected an advantageous battlefield. Many encryption supporters say that the San Bernardino case isn’t really about encryption because the FBI is asking Apple to build custom software that bypasses the phone’s passcode, a separate though related security feature. That distinction, however, may be lost on the public and many members of Congress. Some have even speculated the FBI is using the San Bernardino massacre to revive an encryption debate that it appeared to have lost.

“It appears to me they’re using this case specifically to try to enact a policy proposal they could not get through Congress last year,” said Rep. Ted Lieu (D-Calif.), an encryption advocate. “It’s clear to me that the FBI is trying to use this case to influence the public.”

The fight with Apple not only carries the emotional heft of terrorism, but — thanks to the distinction between encryption backdoors and passcode subversion — has drawn many of Comey’s most vocal critics from the national security community back into the fold.

Hayden, the former NSA head, and McConnell, the nation’s ex-intelligence czar, opposed Congress mandating the creation of technological “back doors” for the government to exploit. Yet, on the Apple case, they side with Comey.

“The FBI made this a test case and that was very deliberate on their part, to refocus the conversation,” said Robert Cattanach, a former Justice Department prosecutor. “This is not some abstract principle of privacy versus government overreach. There are real impacts.”

The San Bernardino case could be a win-win for Comey. If Apple prevails in court, Congress might respond by intervening with legislation. Both the FBI and Apple have said Congress is better equipped to manage the issue than courts.

The legal battles also may discourage companies from building strong encryption given the risk of future legal showdowns, said German, who is now a fellow with the Brennan Center for Justice.

“This is less about Apple than about the developer who is sitting in his garage right now creating the next big thing,” he said. “The idea is to make that person realize that the stronger they build the security the harder it will be for them when they get that order to unlock it to do so. There’s an incentive to build a crack in the system.”

No room for compromise in Apple vs FBI iPhone encryption battle

As Apple’s legal battle with the FBI over encryption heads toward a showdown, there appears little hope for a compromise that would placate both sides and avert a divisive court decision.

The FBI is pressing Apple to develop a system that would allow the law enforcement agency to break into a locked iPhone used by one of the San Bernardino attackers, a demand the tech company claims would make all its devices vulnerable.

In an effort to break the deadlock, some US lawmakers are pushing for a panel of experts to study the issue of access to encrypted devices for law enforcement in order to find common ground.

Senator Mark Warner and Representative Mike McCaul on Monday proposed the creation of a 16-member “National Commission on Security and Technology Challenges.”

But digital rights activists warn that the issue provides little middle ground — that once law enforcement gains a “back door,” there would be no way to close it.

“We are concerned that the commission may focus on short-sighted solutions involving mandated or compelled back doors,” said Joseph Hall, chief technologist at the Center for Democracy & Technology.

“Make no mistake, there can be no compromise on back doors. Strong encryption makes anyone who has a cell phone or who uses the Internet far more secure.”

Kevin Bankston of the New America Foundation’s Open Technology Institute expressed similar concerns.

“We’ve already had a wide range of blue ribbon expert panels consider the issue,” he said.

“And all have concluded either that surveillance back doors are a dangerously bad idea, that law enforcement’s concerns about ‘going dark’ are overblown, or both.”

The debate had been simmering for years before the Apple-FBI row.

Last year, a panel led by Massachusetts Institute of Technology scientists warned against “special access” for law enforcement, saying such mechanisms pose “grave security risks” and “imperil innovation.”

Opening up all data

“I’m not sure there is much room for compromise from a technical perspective,” said Stephen Wicker, a Cornell University professor of computer engineering who specializes in mobile computing security.

Opening the door to the FBI effectively makes any data on any mobile device available to the government, he said.

“This is data that was not available anywhere 10 years ago, it’s a function of the smartphone,” Wicker said.

“We as a country have to ask if we want to say that anything outside our personal human memory should be available to the federal government.”

Apple has indicated it is ready for a “conversation” with law enforcement on the matter.

But FBI Director James Comey told a congressional panel that some answers are needed because “there are times when law enforcement saves our lives, rescues our children.”

Asked about the rights envisioned by the framers of the US constitution, he said, “I also doubt that they imagined there would be any place in American life where law enforcement, with lawful authority, could not go.”

A brief filed on behalf of law enforcement associations argued that because of Apple’s new encryption, criminals “have now switched to the new iPhones as the device of choice for their criminal wrongdoing.”

Ed Black, president of the Computer & Communications Industry Association, which includes major technology firms but not Apple, said that although tech firms and law enforcement have had many battles, “there are many areas where we cooperate and where we find middle ground.”

But Black said the tech sector is largely united in this case because the FBI wants Apple to create weaker software or introduce “malware” to be able to crack the locked iPhone.

“On this narrow specific issue of ‘can companies be compelled to create malware,’ I think there may not be an answer,” he said.

‘Going dark’ fears

Law enforcement fears about “going dark” in the face of new technology have been largely exaggerated, Black said.

While access to encrypted apps and smartphones is difficult and traditional wiretaps don’t work on new technology, “there are a lot of other tools for law enforcement,” he said.

“There is more information available in 2016 than in any year since the founding of the country.”

Although law enforcement has growing expectations about using technology to thwart criminals, that type of power is too broad, Black added.

“If they are seeking a level of total surveillance capability, I don’t see a compromise available,” he said.

Wicker said that to give law enforcement access, Congress could in theory mandate that devices use automatic cloud backups that could not be disabled. But that would constitute a dramatic departure from current views about privacy.

“From an individual rights standpoint,” he said, “that would take away control by the user of their personal information.”

Apple and FBI to testify before Congress next week over encryption

Over the past few days, Apple has made it abundantly clear that it will not comply with the FBI’s demand that it write a new piece of software to help bypass built-in iPhone security measures.

On the contrary, Apple has said that it wants the FBI to withdraw all of its demands while adding that the only way to move forward is to form a commission of experts on intelligence, technology, and civil liberties to discuss “the implications for law enforcement, national security, privacy, and personal freedoms.”

In the meantime, Apple has vehemently argued that Congress should be tasked with determining the fate of the shooter’s iPhone, not the courts. Come next Tuesday, Apple will finally be able to plead its case directly in front of our country’s lawmakers.

Earlier today, the House Judiciary Committee announced that it will be holding a congressional hearing on encryption on Tuesday, March 1. The hearing itself is called, “The Encryption Tightrope: Balancing Americans’ Security and Privacy.”

Slated to testify on the first panel is FBI director James Comey who, you might recall, recently penned a blog post arguing that the current debate isn’t about the implications of encryption, but rather about “the victims and justice.”

On the second panel, Apple’s top lawyer, Bruce Sewell, will appear and present Apple’s case. Appearing alongside him will be Susan Landau, a cybersecurity expert, and New York District Attorney Cyrus R. Vance, Jr.

The House Judiciary Committee also released a statement on the upcoming hearing.

This should undoubtedly make for a lively hearing.

Speaking to the seriousness with which Apple views this debate, Tim Cook yesterday said that helping the FBI would be tantamount to creating the “software equivalent of cancer.”

Here’s why the FBI forcing Apple to break into an iPhone is a big deal

When U.S. Magistrate Sheri Pym ruled that Apple must help the FBI break into an iPhone belonging to one of the killers in the San Bernardino, Calif., shootings, the tech world shuddered.

Why? The battle of encryption “backdoors” has been longstanding in Silicon Valley, where a company’s success could be made or broken based on its ability to protect customer data.

The issue came into the spotlight after Edward Snowden disclosed the extent to which technology and phone companies were letting the U.S. federal government spy on data being transmitted through their networks.

Since those revelations, Facebook, Apple and Twitter have unilaterally said they are not going to create such backdoors anymore.

So here’s the “backdoor” the FBI wants: Right now, iPhone users have the option to set a security feature that only allows a certain number of tries to guess the correct passcode to unlock the phone before all the data on the iPhone is deleted. It’s a security measure Apple put in place to keep important data out of the wrong hands.

Federal prosecutors looking for more information behind the San Bernardino shootings don’t know the phone’s passcode. If they guess incorrectly too many times, the data they hope to find will be deleted.

That’s why the FBI wants Apple to disable the security feature. Once the security is crippled, agents would be able to guess as many combinations as possible.
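Why the erase-after-ten-tries limit matters can be sketched in a few lines of Python. This is a deliberately simplified model: the passcode, the bare hash comparison, and the absence of delays are all assumptions for illustration; a real iPhone checks the passcode inside tamper-resistant hardware with escalating delays, not against a software hash. But it shows why a four-digit passcode only protects data while the retry limit stands:

```python
import hashlib
from itertools import product

def unlock_attempt(guess: str, stored_hash: bytes) -> bool:
    # Illustrative only: models "did the passcode match?" as a hash check.
    return hashlib.sha256(guess.encode()).digest() == stored_hash

secret_passcode = "7294"  # hypothetical 4-digit passcode
stored_hash = hashlib.sha256(secret_passcode.encode()).digest()

# With the 10-try erase limit disabled, an attacker can simply
# enumerate all 10,000 four-digit combinations.
found = None
for digits in product("0123456789", repeat=4):
    guess = "".join(digits)
    if unlock_attempt(guess, stored_hash):
        found = guess
        break
```

With at most 10,000 candidates, the loop finishes almost instantly; the ten-guess erase threshold, not the passcode itself, is what makes this attack impractical, which is exactly why the FBI wants it removed.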

Kurt Opsahl, general counsel for the Electronic Frontier Foundation, a San Francisco-based digital rights non-profit, explained that this “backdoor” means Apple will have to write brand new code that will compromise key features of the phone’s security. Apple has five business days to respond to the request.

What does Apple have to say about this? The company hasn’t commented yet today, but back in December, in a wide-ranging interview with 60 Minutes, Apple CEO Tim Cook defended the company’s use of encryption on its mobile devices, saying users should not have to trade privacy for national security. In the interview, Cook stood by the company’s refusal to provide access to users’ encrypted texts and messages.

Describing a typical user’s iPhone, Cook said: “There’s likely health information, there’s financial information. There are intimate conversations with your family, or your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it. Why is that? It’s because if there’s a way to get in, then somebody will find the way in.”

Cook says Apple cooperates with law enforcement requests, but can’t access encrypted information on users’ smartphones. According to a page on Apple’s website detailing government requests, Apple says encryption data is tied to the device’s passcode.
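The claim that the encryption is “tied to the device’s passcode” can be illustrated with a key-derivation sketch. The parameters below are assumptions for the example: Apple’s actual scheme entangles the passcode with a device-unique hardware key inside the Secure Enclave, which software cannot read out. The sketch uses the standard PBKDF2 construction from Python’s stdlib to show the general idea that the decryption key is computed from the passcode rather than stored anywhere:

```python
import hashlib

passcode = "7294"                       # hypothetical user passcode
device_salt = b"unique-per-device-salt" # stand-in for a device-unique value

# Derive a 256-bit encryption key from the passcode; the many iterations
# deliberately slow down offline guessing.
key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 100_000)

# A different passcode (or a different device salt) yields an unrelated key,
# so without the user's passcode there is no key to hand over.
other = hashlib.pbkdf2_hmac("sha256", b"0000", device_salt, 100_000)
assert key != other
```

Because the key exists only when the passcode is entered, Apple can comply with a warrant by handing over the encrypted bytes while still being genuinely unable to decrypt them.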

Cook also dismissed the idea that iPhone users should swap privacy for security. “We’re America. We should have both.”

What does this mean for the next time the government wants access? The order doesn’t create a precedent in the sense that other courts will be compelled to follow it, but it will give the government more ammunition.

What do digital rights experts have to say? There are two things that make this order very dangerous, Opsahl said. The first is the question it raises about who can make this type of demand. If the U.S. government can force Apple to do this, why can’t the Chinese or Russian governments?

The second is that while the government is requesting a program to allow it to break into this one, specific iPhone, once the program is created it will essentially be a master key. It would be possible for the government to take this key, modify it and use it on other phones. It asks a lot to trust that the government can hold this power without it ever being misused, he said.

And the lawmakers? Well, they are torn. A key House Democrat, Rep. Adam Schiff, D-Calif., says Congress shouldn’t force tech companies to have encryption backdoors. Congress is struggling with how to handle the complex issue.

On the other side of things, Senate Intelligence Committee Chairman Richard Burr, R-N.C., and Vice Chair Dianne Feinstein, D-Calif., say they want to require tech companies to provide a backdoor into encrypted communication when law enforcement officials obtain a court order to investigate a specific person.

What now? This could push the tech companies to give users access to unbreakable encryption. To some extent, it’s already happening. Companies like Apple and Google — responding to consumer demands for privacy — have developed smartphones and other devices with encryption that is so strong that even the companies can’t break it.

Encryption: if this is the best his opponents can do, maybe Jim Comey has a point

  • “We share EPA’s commitment to ending pollution,” said a group of utility executives. “But before the government makes us stop burning coal, it needs to put forward detailed plans for a power plant that is better for the environment and just as cheap as today’s plants. We don’t think it can be done, but we’re happy to consider the government’s design – if it can come up with one.”
  • “We take no issue here with law enforcement’s desire to execute lawful surveillance orders when they meet the requirements of human rights and the rule of law,” said a group of private sector encryption experts, “Our strong recommendation is that anyone proposing regulations should first present concrete technical requirements, which industry, academics, and the public can analyze for technical weaknesses and for hidden costs.”
  • “Building an airbag that doesn’t explode on occasion is practically impossible,” declared a panel of safety researchers who work for industry. “We have no quarrel with the regulators’ goal of 100% safety. But if the government thinks that goal is achievable, it needs to present a concrete technical design for us to review. Until then, we urge that industry stick with its current, proven design.”

Which of these anti-regulation arguments is being put forward with a straight face today? Right. It’s the middle one. Troubled by the likely social costs of ubiquitous strong encryption, the FBI and other law enforcement agencies are asking industry to ensure access to communications and data when the government has a warrant. And their opponents are making arguments that would be dismissed out of hand if they were offered by any other industry facing regulation.

Behind the opponents’ demand for “concrete technical requirements” is the argument that any method of guaranteeing government access to encrypted communications should be treated as a security flaw that inevitably puts everyone’s data at risk. In principle, of course, adding a mechanism for government access introduces a risk that the mechanism will not work as intended. But it’s also true that adding a thousand lines of code to a program will greatly increase the risk of adding at least one security flaw to the program. Yet security experts do not demand that companies stop adding code to their programs. The cost to industry of freezing innovation is deemed so great that the introduction of new security flaws must be tolerated and managed with tactics such as internal code reviews, red-team testing, and bug bounties.

That same calculus should apply to the FBI’s plea for access. There are certainly social and economic costs to giving perfect communications and storage security to everyone – from the best to the worst in society. Whether those costs are so great that we should accept and manage the risks that come with government access is a legitimate topic for debate.

Unfortunately, if you want to know how great those risks are, you can’t really rely on mainstream media, which is quietly sympathetic to opponents of the FBI, or on the internet press, which doesn’t even pretend to be evenhanded on this issue. A good example is the media’s distorted history of NSA’s 1994 Clipper chip. That chip embodied the Clinton administration’s proposal for strong encryption that “escrowed” the encryption keys to allow government access with a warrant.

(Full disclosure: the Clipper chip helped to spur the Crypto War of the 1990s, in which I was a combatant on the government side. Now, like a veteran of the Great War, I am bemused and a little disconcerted to find that the outbreak of a second conflict has demoted mine to “Crypto War I.”)

The Clipper chip and its key escrow mechanism were heavily scrutinized by hostile technologists, and one, Matthew Blaze, discovered that it was possible with considerable effort to use the encryption offered by the chip while bypassing the mechanism that escrowed the key and thus guaranteed government access. Whether this flaw was a serious one can be debated. (Bypassing escrow certainly took more effort than simply downloading and using an unescrowed strong encryption program like PGP, so the flaw may have been more theoretical than real.) In any event, nothing about Matt Blaze’s paper questioned the security being offered by the chip, as his paper candidly admitted. Blaze said, “None of the methods given here permit an attacker to discover the contents of encrypted traffic or compromise the integrity of signed messages. Nothing here affects the strength of the system from the point of view of the communicating parties.” In other words, he may have found a flaw in the Clipper chip, but not in the security it provided to users.

The press has largely ignored Blaze’s caveat. It doesn’t fit the anti-FBI narrative, which is that government access always creates new security holes. I don’t think it’s an accident that no one talks these days about what Matt Blaze actually found except to say that he discovered “security flaws” in Clipper. This formulation allows the reader to (falsely) assume that Blaze’s research shows that government access always undermines security.

The success of this tactic is shown by the many journalists who have fallen prey to this false assumption. Among the reporters fooled by this line is Craig Timberg of the Washington Post: “The [effort] eventually failed amid political opposition but not before Blaze … discovered that the “Clipper Chip” produced by the NSA had crucial security flaws. It turned out to be a back door that a skilled hacker could easily break through.” Also taken in was Nicole Perlroth of the New York Times: “The final blow [to Clipper] was the discovery by Matt Blaze… of a flaw in the system that would have allowed anyone with technical expertise to gain access to the key to Clipper-encrypted communications.”

To her credit, Nicole Perlroth tells me that the New York Times will issue a correction after a three-way Twitter exchange between me, her, and Matt Blaze. But the fact that the error has also cropped up in the Washington Post suggests a larger problem: Reporters are so sympathetic to one side of this debate that we simply cannot rely on them for a straight story on the security risks of government access.

Privacy advocates and tech giants support encryption, which the FBI director finds “depressing”

There’s a privacy battle brewing between the FBI and other federal government groups on one side, and tech companies, cryptologists, privacy advocates (and some elected American lawmakers) on the other.

Basically, the FBI (circa-2015 edition) opposes the use of encryption to keep data secure from hackers, on the grounds that the government couldn’t get at it either.

So this week, a wide variety of organizations ranging from civil-liberty groups and privacy advocates to tech companies and trade associations to security and policy experts sent President Obama an open letter urging him to reject any legislation that would outlaw secure encryption.

Change of heart

The FBI used to take the same view: encryption is a good way for innocent people to protect themselves and their personal data from criminals, so if encryption is available to you, you should use it.

In October 2012, the FBI’s “New E-Scams and Warnings” website even published an article warning that “Smartphone Users Should be Aware of Malware Targeting Mobile Devices and Safety Measures to Help Avoid Compromise.” That article included a bullet-pointed list of “Safety tips to protect your mobile device.”

And the second tip on the list says this: “Depending on the type of phone, the operating system may have encryption available. This can be used to protect the user’s personal data in the case of loss or theft.”

But in September 2013, when current FBI director James Comey took over the bureau, he brought a very different view of encryption: he thinks it only benefits criminals.

“Very dark place”

For example, when Apple launched its iPhone 6 last September, it touted the phone's strong security features, including automatic data encryption. Comey then predicted that encrypted communications could lead to a "very dark place," and criticized "companies marketing something expressly to allow people to place themselves beyond the law" (as opposed to, say, marketing something expressly so people know hackers can't steal photographs, financial information and other personal data off their phones).

Comey went so far as to suggest that Congress make data encryption illegal by rewriting the 20-year-old Communications Assistance for Law Enforcement Act to cover apps and other technologies that didn't exist back in 1994.

And this week, in response to the tech companies' and privacy advocates' open letter to President Obama, Comey said he found the letter depressing: "I frankly found it depressing because their letter contains no [acknowledgment] that there are societal costs to universal encryption …. All of our lives, including the lives of criminals and terrorists and spies, will be in a place that is utterly unavailable to the court-ordered process. That, I think, to a democracy should be very concerning."

Get a warrant

Yet despite Comey’s concerns, the idea that encryption would make it utterly impossible for police and courts to stop dangerous criminals is not true. Even with encryption, police or the FBI can still get data off your phone; they just can’t do it without your knowledge, as Jose Pagliery pointed out.

That’s what FBI Director James Comey finds “depressing,” or likely to lead to a “very dark place”: the idea that if the government wants access to your personal data, it still has to get a warrant first.

FBI Quietly Removes Recommendation To Encrypt Your Phone… As FBI Director Warns How Encryption Will Lead To Tears

Back in October, we highlighted the contradiction of FBI Director James Comey raging against encryption and demanding backdoors, while at the very same time the FBI’s own website was suggesting mobile encryption as a way to stay safe. Sometime after that post went online, all of the information on that page about staying safe magically disappeared, though thankfully I screenshotted it at the time:

If you really want, you can still see that information over at the Internet Archive or in a separate press release the FBI apparently hasn’t tracked down and memory-holed yet. Still, it’s no surprise that the FBI quietly deleted that original page recommending that you encrypt your phones “to protect the user’s personal data,” because the big boss man is going around spreading a bunch of scare stories about how we’re all going to be dead or crying if people actually encrypted their phones:

Calling the use of encrypted phones and computers a “huge problem” and an affront to the “rule of law,” Comey painted an apocalyptic picture of the world if the communications technology isn’t banned.

“We’re drifting to a place where a whole lot of people are going to look at us with tears in their eyes,” he told the House Appropriations Committee, describing a hypothetical in which a kidnapped young girl’s phone is discovered but can’t be unlocked.

So, until recently, the FBI was actively recommending you encrypt your data to protect your safety — and yet, today it’s “an affront to the rule of law.” Is this guy serious?

More directly, this should raise serious questions about what Comey thinks his role is at the FBI (or what the FBI’s role is for the country). Is it to keep Americans safe — or is it to undermine their privacy and security just so it can spy on everyone?

Not surprisingly, Comey pulls out the trifecta of FUD in trying to explain why the FBI needs to spy on everyone: pedophiles, kidnappers and drug dealers:

“Tech execs say privacy should be the paramount virtue,” Comey continued. “When I hear that I close my eyes and say try to imagine what the world looks like where pedophiles can’t be seen, kidnappers can’t be seen, drug dealers can’t be seen.”

Except we know exactly what that looks like — because that’s the world we’ve basically always lived in. And yet, law enforcement folks like the FBI and various police departments were able to use basic detective work to track down criminals.

If you want to understand just how ridiculous Comey’s arguments are, simply replace his desire for unencrypted devices with video cameras in every corner of your home that stream directly into the FBI. Same thing. Would that make it easier for the FBI to solve some crimes? Undoubtedly. Would it be a massive violation of privacy and put many more people at risk? Absolutely.

It’s as if Comey has absolutely no concept of a cost-benefit analysis. Judging by what he has to say, all “bad people” must be stopped, even if it means destroying all of our freedoms. That’s insane — and raises serious questions about his competence to lead a government agency charged with protecting the Constitution.