US officials target social media, encryption after Chattanooga shooting

Was the Chattanooga shooter inspired by IS propaganda? There’s no evidence to back the claim, but some officials are already calling for access to encrypted messages and social media monitoring. Spencer Kimball reports.

It’s not an unusual story in America: A man in his 20s with an unstable family life, mental health issues and access to firearms goes on a shooting spree, shattering the peace of middle class life.

This time, the shooter’s name was Muhammad Youssef Abdulazeez, a Kuwaiti-born naturalized US citizen, the son of Jordanian parents of Palestinian descent. And he targeted the military.

Abdulazeez opened fire on a recruiting center and a naval reserve facility in Chattanooga, Tennessee, last Thursday. Four Marines and a sailor, all unarmed, died in the attack.

But the picture that’s emerged from Chattanooga over the past several days is complicated, raising questions about mental health, substance abuse, firearms, religion and modernity.

Yet elected officials have been quick to suggest that events in Chattanooga were directly inspired by “Islamic State” (also known as ISIL or ISIS) Internet propaganda, though there’s still no concrete evidence to back up that claim.

“This is a classic lone wolf terrorist attack,” Senator Dianne Feinstein told US broadcaster CBS. “Last year, 2014, ISIL put out a call for people to kill military people, police officers, government officials and do so on their own, not wait for direction.”

And according to Feinstein, part of the solution is to provide the government with greater access to digital communications.

“It is now possible for people, if they’re going to talk from Syria to the United States or anywhere else, to get on an encrypted app which cannot be decrypted by the government with a court order,” Feinstein said.

Going dark

Two years ago, former NSA contractor Edward Snowden revealed the extent of US government surveillance to the public. Responding to public outcry in the wake of the NSA revelations, companies such as Facebook, Yahoo, Google and others stepped up efforts to encrypt users’ personal data.

But the Obama administration, in particular FBI Director James Comey, has expressed growing concern about encryption technology. Law enforcement argues that even with an appropriate court order they still cannot view communications masked by such technology. They call it “going dark.”

Feinstein and others believe that Internet companies have an obligation to provide law enforcement with a way to view encrypted communications, if there’s an appropriate court order. But according to Emma Llanso, that would only create greater security risks.

“If you create a vulnerability in your encryption system, you are creating a vulnerability that can be exploited by any malicious actor anywhere in the world,” Llanso, director of the Free Expression Project at the Center for Democracy and Technology, told DW.

Monitoring social media

It’s not just an issue of encryption technology. There’s also concern about how militant groups such as the “Islamic State” are using social media, in particular Twitter.

“This is the new threat that’s out there over the Internet that’s very hard to stop,” Representative Michael McCaul told ABC’s This Week. “We have over 200,000 ISIS tweets per day that hit the United States.

“If it can happen in Chattanooga, it can happen anywhere, anytime, any place and that’s our biggest fear,” added McCaul, the chairman of the House Homeland Security committee.

In the Senate, an intelligence funding bill includes a provision that would require Internet companies to report incidents of “terrorist activity” on their networks to authorities.

According to Llanso, such activity isn’t defined anywhere in the provision, which means companies would have an incentive to overreport in order to meet their obligations. And speech clearly protected by the US First Amendment can also lead to incitement, said Philip Seib, co-author of “Global Terrorism and New Media.”

“If somebody puts something up on Facebook that says Muslims are being oppressed in the Western world, maybe that’s an incentive to somebody to undertake a violent act,” Seib told DW. “But you can’t pull that down, that is a free speech issue.”

Islamist connections?

In the case of Chattanooga, it’s unclear how government access to encrypted communications or requiring social media reporting would have stopped the shooting. One of Abdulazeez’s friends told CNN that the 24-year-old actually opposed the “Islamic State,” calling it a “stupid group” that “was completely against Islam.”

But Abdulazeez was critical of US foreign policy and expressed a desire to become a martyr in his personal writings, according to CNN sources. The young man’s father was put on a terrorist watch list but was then cleared of allegedly donating money to a group tied to Hamas. Abdulazeez also spent seven months in Jordan visiting family in 2014.

He also reportedly viewed content related to radical cleric Anwar al-Awlaki. An American citizen, Awlaki was killed in 2011 by a US drone strike in Yemen for alleged ties to al Qaeda in the Arabian Peninsula.

“The Guardian” reported that just hours before the shooting spree, Abdulazeez sent a text message to a friend with a verse from the Koran: “Whosoever shows enmity to a friend of Mine, then I have declared war against him.”

Guns, drugs and depression

Abdulazeez reportedly suffered from depression and had suicidal thoughts. He abused alcohol and drugs, including marijuana and caffeine pills. He had recently been arrested and charged with driving under the influence, with a court date set for July 30. He also took muscle relaxants for back pain and sleeping pills for a night shift at a manufacturing plant, according to the Associated Press.

His family life was also unstable. In 2009, Abdulazeez’s mother filed for divorce, accusing his father of abuse. The two later reconciled, according to the “New York Times.”

And he had access to guns, including an AK-47 assault rifle. Abdulazeez liked to go shooting and hunting. He also participated in mixed martial arts.

Officials told ABC News that Abdulazeez had conducted Internet research on Islamist militant justifications for violence, perhaps hoping to find religious atonement for his problems.

“The campaigns by the Western governments – the US primarily, the Brits and others – have indicated that they don’t really understand what’s going on in the minds of many young Muslims,” Seib told DW.

“The Western efforts don’t ring true amongst many people they seek to reach because on issues such as human rights the Western governments don’t have much credibility,” he added.

After Washington Post rolls out HTTPS, its editorial board bemoans encryption debate

There’s hope that by the time the Washington Post’s editorial board takes a third crack at the encryption whip, it might say something worthwhile.

Late on Saturday, The Washington Post’s editorial board published what initially read as a scathing anti-encryption, pro-government opinion piece, one that scolded Apple and Google for providing “end-to-end encryption” on their devices (a somewhat inaccurate characterization of what either company actually offers) and thereby locking out federal authorities investigating serious crimes and terrorism.

Read to the end, and you’ll find the editorial came up with nothing.

It was a bland and mediocre follow-up to a similar opinion piece, one that was called “staggeringly dumb” and “seriously embarrassing” for proposing a “golden key” to bypass encryption.

Critically, what the Post gets out of this editorial remains unclear, with the possible exception of riling up members of the security community. It’s not as though the company is particularly invested in either side. Aside from the inaccuracies in the board’s opinion, and the fair (and accurate) accusation that the article said “nothing” (one assumes that means nothing of “worth” or “value”), it’s hypocritical to keep issuing statements on this matter while at the same time becoming the first major news outlet to encrypt its entire website.

The board’s follow-up, a note of fewer than 600 words, did not offer anything new, but reaffirmed its desire to see both tech companies and law enforcement “reconcile the competing imperatives” for privacy and data access, respectively. (It’s worth noting the board’s opinion does not represent every journalist or reporter working at the national daily, but it does reflect the institution’s views as a whole.)

Distinguished security researcher Kenn White dismissed the editorial in just three words: “Nope. No need.”

Because right now, there is no viable way to offer encrypted services while also allowing police and federal agencies access to that scrambled information through so-called “backdoor” means. Just last week, a group of 13 of the world’s preeminent cryptographers and security researchers released a paper (which White linked to in his tweet) explaining that “such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend.”

In other words: if there’s a secret way in for the police and the feds, who’s to say a hacker won’t find it, too?

The Post’s own decision to roll out encryption across its site seems bizarre considering the editorial board’s conflicting views on the matter.

Such head-scratching naivety prompted one security expert to ask anyone who covers security at the Post to “explain reality” to the board. Because, clearly, the board isn’t doing its job well if on two separate occasions it has fluffed its reporting on a subject with zero technical insight.

If the board needs help navigating the topic, however, there is no doubt a long line of security experts, academics, and researchers ready to assist. At least then there’s hope the board can get it right on its third try.

TeslaCrypt 2.0 comes with stronger encryption and a CryptoWall disguise

TeslaCrypt, ransomware primarily known for encrypting gaming files, has beefed up its techniques and, most recently, greatly improved the encryption in its new 2.0 version.

Kaspersky Lab wrote in a blog post that TeslaCrypt 2.0 not only makes it impossible to decrypt files without the attackers’ key, but also uses an HTML page copied directly from a separate ransomware family: CryptoWall. And to take it a step further, TeslaCrypt no longer uses its own name; it instead opts to disguise itself as CryptoWall.

More specifically, once infected, a victim is taken to an HTML payment page directly copied from CryptoWall. It only differs in that the URLs lead to TeslaCrypt’s Tor-based servers.

Fedor Sinitsyn, senior malware analyst at Kaspersky, said in emailed comments to SCMagazine.com that he couldn’t provide an answer as to why the gaming ransomware might be using this disguise, but he speculated it’s “aimed to scare the victim and to puzzle experts trying to help the victim.”

While TeslaCrypt might not be as notorious or recognizable as CryptoWall, the ransomware’s new encryption scheme could put it higher up on IT professionals’ threat radar. Previous versions saved data in a file that could be used to recover the decryption key, Sinitsyn said; in version 2.0, that critical data is no longer saved on the system. Backups are therefore more imperative than ever, and Sinitsyn emphasized that they are the best defense against ransomware attacks.

“System administrators should be in charge of corporate backup and be leading the process on the corporate level,” he said. “Also, they should educate their users on how to protect themselves from ransomware.”

TeslaCrypt mainly spreads through exploit kits, including Angler, Sweet Orange and Nuclear, and a large portion of its infections have been in the U.S.

“Ransomware as a threat is growing, criminals develop new and sophisticated pieces of malware, and in many cases decryption of the attacked files is impossible,” Sinitsyn said. “If your data is valuable, please take your time to make reliable backup copies.”

Encryption: if this is the best his opponents can do, maybe Jim Comey has a point

  • “We share EPA’s commitment to ending pollution,” said a group of utility executives. “But before the government makes us stop burning coal, it needs to put forward detailed plans for a power plant that is better for the environment and just as cheap as today’s plants. We don’t think it can be done, but we’re happy to consider the government’s design – if it can come up with one.”
  • “We take no issue here with law enforcement’s desire to execute lawful surveillance orders when they meet the requirements of human rights and the rule of law,” said a group of private sector encryption experts, “Our strong recommendation is that anyone proposing regulations should first present concrete technical requirements, which industry, academics, and the public can analyze for technical weaknesses and for hidden costs.”
  • “Building an airbag that doesn’t explode on occasion is practically impossible,” declared a panel of safety researchers who work for industry. “We have no quarrel with the regulators’ goal of 100% safety. But if the government thinks that goal is achievable, it needs to present a concrete technical design for us to review. Until then, we urge that industry stick with its current, proven design.”

Which of these anti-regulation arguments is being put forward with a straight face today? Right. It’s the middle one. Troubled by the likely social costs of ubiquitous strong encryption, the FBI and other law enforcement agencies are asking industry to ensure access to communications and data when the government has a warrant. And their opponents are making arguments that would be dismissed out of hand if they were offered by any other industry facing regulation.

Behind the opponents’ demand for “concrete technical requirements” is the argument that any method of guaranteeing government access to encrypted communications should be treated as a security flaw that inevitably puts everyone’s data at risk. In principle, of course, adding a mechanism for government access introduces a risk that the mechanism will not work as intended. But it’s also true that adding a thousand lines of code to a program will greatly increase the risk of adding at least one security flaw to the program. Yet security experts do not demand that companies stop adding code to their programs. The cost to industry of freezing innovation is deemed so great that the introduction of new security flaws must be tolerated and managed with tactics such as internal code reviews, red-team testing, and bug bounties.

That same calculus should apply to the FBI’s plea for access. There are certainly social and economic costs to giving perfect communications and storage security to everyone – from the best to the worst in society. Whether those costs are so great that we should accept and manage the risks that come with government access is a legitimate topic for debate.

Unfortunately, if you want to know how great those risks are, you can’t really rely on mainstream media, which is quietly sympathetic to opponents of the FBI, or on the internet press, which doesn’t even pretend to be evenhanded on this issue. A good example is the media’s distorted history of NSA’s 1994 Clipper chip. That chip embodied the Clinton administration’s proposal for strong encryption that “escrowed” the encryption keys to allow government access with a warrant.

(Full disclosure: the Clipper chip helped to spur the Crypto War of the 1990s, in which I was a combatant on the government side. Now, like a veteran of the Great War, I am bemused and a little disconcerted to find that the outbreak of a second conflict has demoted mine to “Crypto War I.”)

The Clipper chip and its key escrow mechanism were heavily scrutinized by hostile technologists, and one, Matthew Blaze, discovered that it was possible with considerable effort to use the encryption offered by the chip while bypassing the mechanism that escrowed the key and thus guaranteed government access. Whether this flaw was a serious one can be debated. (Bypassing escrow certainly took more effort than simply downloading and using an unescrowed strong encryption program like PGP, so the flaw may have been more theoretical than real.) In any event, nothing about Matt Blaze’s paper questioned the security being offered by the chip, as the paper candidly admitted. Blaze said, “None of the methods given here permit an attacker to discover the contents of encrypted traffic or compromise the integrity of signed messages. Nothing here affects the strength of the system from the point of view of the communicating parties.” In other words, he may have found a flaw in the Clipper chip, but not in the security it provided to users.

The press has largely ignored Blaze’s caveat.  It doesn’t fit the anti-FBI narrative, which is that government access always creates new security holes. I don’t think it’s an accident that no one talks these days about what Matt Blaze actually found except to say that he discovered “security flaws” in Clipper.  This formulation allows the reader to (falsely) assume that Blaze’s research shows that government access always undermines security.

The success of this tactic is shown by the many journalists who have fallen prey to this false assumption. Among the reporters fooled by this line was Craig Timberg of the Washington Post: “The effort eventually failed amid political opposition but not before Blaze … discovered that the ‘Clipper Chip’ produced by the NSA had crucial security flaws. It turned out to be a back door that a skilled hacker could easily break through.” Also taken in was Nicole Perlroth of the New York Times: “The final blow [to Clipper] was the discovery by Matt Blaze … of a flaw in the system that would have allowed anyone with technical expertise to gain access to the key to Clipper-encrypted communications.”

To her credit, Nicole Perlroth tells me that the New York Times will issue a correction after a three-way Twitter exchange between me, her, and Matt Blaze. But the fact that the error has also cropped up in the Washington Post suggests a larger problem: Reporters are so sympathetic to one side of this debate that we simply cannot rely on them for a straight story on the security risks of government access.

In The Debate Over Strong Encryption, Security And Liberty Must Win

When Sen. Chuck Grassley (R-Iowa) gaveled a Senate Judiciary Committee hearing into session on Wednesday, he called it the “start” of a conversation about privacy, security and encryption. Frankly, it was just the latest forum for a much older discussion.

While it may have been the beginning of a long day on Capitol Hill for FBI Director James Comey, the national conversation about law enforcement and strong encryption has been ongoing since the 1990s and the so-called “Crypto Wars.” While the debate now has a charged geopolitical context, includes the biggest tech companies on the planet and involves smartphone encryption, it’s not a new one.

No cryptographers testified at Wednesday’s hearing. If one had been present, he or she might have told the representatives of the Federal Bureau of Investigation and the Justice Department that what they were asking Silicon Valley to develop — retaining the capacity to respond to lawful orders by providing data from computer systems with end-to-end encryption — isn’t technically feasible in a way that doesn’t fundamentally compromise the security of those systems.

If any of the 15 experts in cryptography who authored a new white paper on encryption had been called to testify, they likely would have made that case:

In the wake of the growing economic and social cost of the fundamental insecurity of today’s Internet environment, any proposals that alter the security dynamics online should be approached with caution. Exceptional access would force Internet system developers to reverse forward secrecy design practices that seek to minimize the impact on user privacy when systems are breached. The complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws. Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law.

The FBI and Justice Department may want the tech industry to “try harder” and give a “full, honest effort” to provide a technological way to provide access to encrypted information, but the tech industry isn’t biting.

“Proposals to mandate weakened encryption would undermine security and end user confidence in the Internet without any clear national security benefits,” said Abigail Slater, the vice president of legal and regulatory policy at the Internet Association.

“Strong encryption protects billions of global end users from countless privacy threats ranging from financial fraud to repressive governments stifling speech and democracy. Instead of forcing companies to lower their security standards, policymakers should promote and protect the wide adoption of strong encryption technology.”

In his spoken testimony, Comey said, “There is no such thing as secure: There’s only more secure and less secure.”

Of that, there is no doubt. “Split key encryption,” where digital master keys to unlock encrypted data or systems are held in escrow, is less secure, just as it was when government officials proposed it nearly two decades ago.
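
Mechanically, splitting a master key into escrowed shares is simple; the hypothetical Python sketch below shows the two-of-two XOR split that escrow proposals build on, with each share meant to be held by a different escrow agent. The insecurity critics point to lies not in this arithmetic but in the escrow databases, access procedures and recovery infrastructure that must exist around it.

```python
# Two-of-two XOR secret sharing: an illustration of the "split key"
# concept, not the Clipper-era design. Either share alone reveals
# nothing about the key; both are needed to reconstruct it.
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    share_a = secrets.token_bytes(len(key))             # uniformly random pad
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b                             # escrow separately

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))

master_key = secrets.token_bytes(32)
a, b = split_key(master_key)
assert recover_key(a, b) == master_key                  # both shares required
```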

The Justice Department and FBI may want to have a debate on encryption, but they’ve been dealt a losing hand at this table.

As law professor Peter Swire testified later in the Senate hearing, the review group on intelligence and communications technologies that President Barack Obama convened in August 2013 unequivocally recommended supporting strong encryption in its report on liberty and security later that year:

The US Government should take additional steps to promote security, by (1) fully supporting and not undermining efforts to create encryption standards; (2) making clear that it will not in any way subvert, undermine, weaken, or make vulnerable generally available commercial encryption; and (3) supporting efforts to encourage the greater use of encryption technology for data in transit, at rest, in the cloud, and in storage.

That conclusion is anything but isolated, as Kevin Bankston, the director of the Open Technology Institute at the New America Foundation, pointed out in an essay Tuesday:

…the broad consensus outside of the FBI is that the societal costs of such surveillance backdoors — or “front doors,” as Comey prefers to call them — far outweigh the benefits to law enforcement, and that strong encryption will ultimately prevent more crimes than it obscures.

Tech companies, privacy advocates, security experts, policy experts, all five members of President Obama’s handpicked Review Group on Intelligence and Communications Technologies, UN human rights experts, and a majority of the House of Representatives all agree: Government-mandated backdoors are a bad idea. There are countless reasons why this is true, including: They would unavoidably weaken the security of our digital data, devices, and communications even as we are in the midst of a cybersecurity crisis; they would cost the US tech industry billions as foreign customers — including many of the criminals Comey hopes to catch — turn to more secure alternatives; and they would encourage oppressive regimes that abuse human rights to demand backdoors of their own.

Bankston is no zealot, nor has he impugned the honor, intentions or distinguished public service record of Comey, who has notably stood on the side of civil liberties in his career.

What Bankston and many others are saying, and have been saying for years, however, is that protecting the privacy of citizens from those who would do them harm or steal from them is now intrinsically bound to encrypting devices, communications and data.

That’s true whether for cellphones, email, health records, tax transcripts or the personnel records of tens of millions of public servants.

This isn’t a competition between privacy and security or a choice between opposing value systems: it’s security and security, and on the line is the capacity of democratic societies to do investigative journalism, engage in digital commerce or securely make transactions with government.

It’s fair to acknowledge that the FBI may have a diminished capacity to conduct some investigations as a result, but in striking an appropriate balance between safety and liberty, that is sometimes the outcome.

FBI chief wants ‘backdoor access’ to encrypted communications to fight Isis

The director of the Federal Bureau of Investigation has warned US senators that the threat from the Islamic State merits a “debate” about limiting commercial encryption – the linchpin of digital security – despite a growing chorus of technical experts who say that undermining encryption would prove an enormous boon for hackers, cybercriminals, foreign spies and terrorists.

In a pair of appearances before the Senate’s judiciary and intelligence committees on Wednesday, James Comey testified that Isis’s use of end-to-end encryption, whereby the messaging service used to send information does not have access to the decryption keys of those who receive it, helped the group place a “devil” on the shoulders of potential recruits “saying kill, kill, kill, kill”.
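
In practice, end-to-end encryption of this sort is typically built from a public key exchange plus an authenticated symmetric cipher. The Python sketch below is a minimal illustration of that idea, not the protocol WhatsApp or any other messenger actually ships (real designs add key ratcheting and identity verification); the point is that a relaying server only ever handles public keys and ciphertext.

```python
# End-to-end sketch: Alice and Bob derive a shared key from an X25519
# exchange; a server relaying their messages sees only public keys and
# ciphertext. Illustrative only.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(private_key, peer_public_key):
    """Turn the raw X25519 shared secret into a 32-byte symmetric key."""
    shared_secret = private_key.exchange(peer_public_key)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"demo e2e chat").derive(shared_secret)

alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Only the PUBLIC halves ever travel through the server.
alice_key = derive_key(alice_priv, bob_priv.public_key())
bob_key = derive_key(bob_priv, alice_priv.public_key())
assert alice_key == bob_key  # both endpoints derive the same key

nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(alice_key).encrypt(nonce, b"hello Bob", None)
# The server can relay (nonce, ciphertext) but cannot decrypt it.
print(ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None))
```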

Comey said that while the FBI is thus far disrupting Isis plots, “I cannot see me stopping these indefinitely”. He added: “I am not trying to scare folks.”

Since October, following Apple’s decision to bolster its mobile-device security, Comey has called for a “debate” about inserting “back doors” – or “front doors”, as he prefers to call them – into encryption software, warning that “encryption threatens to lead us all to a very, very dark place.”

But Comey and deputy attorney general Sally Quillian Yates testified that they do not at the moment envision proposing legislation to mandate surreptitious or backdoor access for law enforcement. Both said they did not wish the government itself to hold user encryption keys and preferred to “engage” communications providers for access, though technologists have stated that what Comey and Yates seek is fundamentally incompatible with end-to-end encryption.

Comey, who is not a software engineer, said his response to that was: “Really?” He framed himself as an advocate of commercial encryption to protect personal data who believed that the finest minds of Silicon Valley can invent new modes of encryption that can work for US law enforcement and intelligence agencies without inevitably introducing security flaws.

While the FBI director did not specifically cite which encrypted messaging apps Isis uses, the Guardian reported in December that its grand mufti used WhatsApp to communicate with his former mentor. WhatsApp adopted end-to-end encryption last year.

“I think we need to provide a court-ordered process for obtaining that data,” said Dianne Feinstein, the California Democrat and former intelligence committee chair who represents Silicon Valley.

But Comey’s campaign against encryption has run into a wall of opposition from digital security experts and engineers. Their response is that there is no technical way to insert a back door into security systems for governments that does not leave the door ajar for anyone — hackers, criminals, foreign intelligence services — to exploit and gain access to enormous treasure troves of user data, including medical records, financial information and much more.

The cybersecurity expert Susan Landau, writing on the prominent blog Lawfare, called Comey’s vision of a security flaw only the US government could exploit “magical thinking”.

Comey is aided in his fight against encryption by two allies, one natural and the other accidental. The natural ally is the National Security Agency director, Michael Rogers, who in February sparred with Yahoo’s chief of information security when the Yahoo official likened the anti-crypto push to “drilling a hole in the windshield”, saying: “I just believe that this is achievable. We’ll have to work our way through it.” The Guardian, thanks to Edward Snowden’s disclosures, revealed in September 2013 that the NSA already undermines encryption.

The less obvious ally is China, which the FBI blamed last month for stealing a massive hoard of federal personnel data.

In May, China unveiled a national security law calling for “secure and controllable” technologies, something US and foreign companies fear is a prelude to a demand for backdoor entry into companies’ encryption software or outright provision of encryption keys.

Without ever mentioning his own FBI director’s and NSA director’s similar demands, Barack Obama castigated China’s anti-encryption push in March. Obama has also declined to criticize efforts in the UK, the US’s premier foreign ally, to undermine encryption. Prime minister David Cameron is proposing to introduce legislation in the autumn to force companies such as Apple, Google and Microsoft to provide access to encrypted data.

Under questioning from some skeptical senators, Comey made a number of concessions. When Ron Wyden, an Oregon Democrat, asked if foreign countries would attempt to mandate similar access, Comey replied, “I think they might.” The director acknowledged that foreign companies, exempt from any hypothetical US mandate, would be free to market encryption software.

In advance of Comey’s testimony, several of the world’s leading cryptographers, alarmed by the return of a battle they thought won during the 1990s “Crypto Wars”, rejected the effort as pernicious from a security perspective and technologically illiterate.

A paper they released on Tuesday, called “Keys Under Doormats”, said the transatlantic effort to insert backdoors into encryption was “unworkable in practice, raise[s] enormous legal and ethical questions, and would undo progress on security at a time when internet vulnerabilities are causing extreme economic harm”.

Asked by Feinstein if the experts had a point, Comey said: “Maybe. If that’s the case, I guess we’re stuck.”

Kevin Bankston of the New America Foundation called into question the necessity of Comey’s warnings that encryption would lead to law enforcement “going dark” against threats. Bankston, in a Tuesday blogpost, noted that the government’s latest wiretap disclosure found that state and federal governments could not access four encrypted conversations out of 3,554 wiretapped in 2014.

Yet Yates said both that the Justice Department was “increasingly” facing the encryption challenge and that she lacked data quantifying how serious the challenge was. Yates told the Senate judiciary committee that law enforcement sometimes declines to seek warrants at all when communications are encrypted, but she did not say how often such decisions are made.

Software developers are not carrying out encryption properly

Despite a big push over the past few years to use encryption to combat security breaches, lack of expertise among developers and overly complex libraries have led to widespread implementation failures in business applications.

The scale of the problem is significant. Cryptographic issues are the second most common type of flaws affecting applications across all industries, according to a report this week by application security firm Veracode.

The report is based on static, dynamic and manual vulnerability analysis of over 200,000 commercial and self-developed applications used in corporate environments.

Cryptographic issues ranked higher in prevalence than historically common flaws like cross-site scripting, SQL injection and directory traversal. They included things like improper TLS (Transport Layer Security) certificate validation, cleartext storage of sensitive information, missing encryption for sensitive data, hard-coded cryptographic keys, inadequate encryption strength, insufficient entropy, non-random initialization vectors, improper verification of cryptographic signatures, and more.
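
Several of the flaws on that list are easy to picture in code. The following hypothetical Python fragment (an illustration, not an example from Veracode’s report) commits two of them at once, a hard-coded key and a non-random initialization vector, and then shows a safer version of the same function.

```python
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

# FLAWED: a hard-coded cryptographic key and a constant, reused nonce.
# Reusing a nonce under the same AES-GCM key breaks both confidentiality
# and the integrity guarantees of the mode.
KEY = b"0123456789abcdef0123456789abcdef"   # hard-coded key (32 bytes)
NONCE = b"\x00" * 12                        # non-random initialization vector

def encrypt_badly(plaintext: bytes) -> bytes:
    return AESGCM(KEY).encrypt(NONCE, plaintext, None)

# BETTER: the key comes from outside the code (a key management service
# or environment), and a fresh random nonce is generated per message and
# stored alongside the ciphertext.
def encrypt_properly(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_properly(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```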

The majority of the affected applications were Web-based, but mobile apps also accounted for a significant percentage.

Developers are adding a lot of crypto to their code, especially in sectors like health care and financial services, but they’re doing it poorly, said Veracode CTO Chris Wysopal.

Many organizations need to use encryption because of data protection regulations, but the report suggests their developers don’t have the necessary training to implement it properly. “It goes to show how hard it is to implement cryptography correctly,” Wysopal said. “It’s sort of an endemic issue that a lot of people don’t think about.”

Many developers believe they know how to implement crypto, but they haven’t had any specific training in cryptography and have a false sense of security, he said. The result is applications where encryption is present, so the checkbox gets ticked, yet attackers are still able to get at sensitive data.

And that doesn’t even touch on cases where developers decide to create their own crypto algorithms, a bad idea that’s almost always destined to fail. Veracode only tested implementations that used standard cryptographic APIs (application programming interfaces) offered by programming languages like Java and .NET or popular libraries like OpenSSL.

Programming languages like Java and .NET try to protect developers from making errors more than older languages like C, said Carsten Eiram, the chief research officer at vulnerability intelligence firm Risk Based Security, via email.

“However, many people argue that since modern languages are easier to program in and protect programmers more from making mistakes, more of them may be lulled into a false sense of security and not show proper care when coding, i.e. increasing the risk of introducing other types of problems like design and logic errors. Not implementing crypto properly would fall into that category,” Eiram said.

Too many programmers think that they can just link to a crypto library and they’re done, but cryptography is hard to implement robustly if you don’t understand the finer aspects of it, like checking certificates properly, protecting the encryption keys, using appropriate key sizes or using strong pseudo-random number generators.
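
The pseudo-random number pitfall, for one, fits in a couple of lines. This hypothetical Python fragment contrasts a general-purpose generator, which is predictable and must never produce key material, with a cryptographically secure one:

```python
import random   # Mersenne Twister: state is recoverable from its output
import secrets  # CSPRNG backed by the operating system

# FLAWED: insufficient entropy; an attacker who observes enough outputs
# can reconstruct the generator state and predict every "random" key.
weak_key = bytes(random.randrange(256) for _ in range(32))

# BETTER: a cryptographically secure source, with a 256-bit key size.
strong_key = secrets.token_bytes(32)
```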

“All this ultimately comes down to better education of programmers to understand all the pitfalls when implementing strong crypto,” Eiram said.

But it’s not only the developers’ fault. Matthew Green, a professor of cryptography engineering at Johns Hopkins University in Baltimore, thinks that many crypto libraries are “downright bad” from a usability perspective because they’ve been designed by and for cryptographers. “Forcing developers to use them is like expecting someone to fly an airplane when all they have is a driver’s license,” he said via email.

Green believes that making cryptographic software easier to use — ideally invisible so that people don’t even have to think about it — would be a much more efficient approach than training developers to be cryptographers.

“We don’t expect developers to re-implement TCP [a core Internet protocol] or the entire file system every time they write something,” he said. “The fact that current crypto APIs are so bad is just a reflection of the fact that crypto, and security in general, are less mature than those other technologies.”

The authors of some cryptographic libraries are aware that their creations should be easier to use. For example, the OpenSSL project’s roadmap, published last June, lists reducing API complexity and improving documentation as goals to be reached within one year. While not disputing that some crypto libraries are overly complex, Eiram doesn’t agree that developers need to be cryptographers in order to implement crypto correctly.

The crypto APIs in Java and .NET — the programming languages most used by the apps covered in Veracode’s report — were designed specifically for developers and provide most of what they need in terms of crypto features when developing applications in those languages, Eiram said.

“While it’s always preferable that libraries including crypto libraries are made to be used as easily as possible, the programmers using them ultimately need to at least understand on a high level how they work,” he said. “I really see it as a two-way street: Make crypto as easy to use as possible, but programmers having to implement crypto in applications should also properly educate themselves instead of hoping for someone to hold their hand.”

In addition to the lack of crypto expertise among developers and the complexity of some crypto libraries, forgetting to turn security features back on after product testing is another common source of failures, according to Green. For example, developers will often turn off TLS certificate validation in their testing environments because they don’t have a valid certificate installed on their test servers, but then forget to turn it back on when the product moves into production.
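
In Python, the mistake Green describes often looks like the hypothetical fragment below (the endpoints are made up), using the popular requests library: certificate validation is switched off against a test server and then ships that way. Keeping validation on and pointing the client at the test environment’s own CA bundle avoids the trap.

```python
import os
import requests

# FLAWED: validation silenced for the test server; if this line survives
# into production, the client will accept any certificate, including an
# attacker's.
resp = requests.get("https://test.internal.example/api", verify=False)

# BETTER: leave verification on everywhere and supply the test
# environment's own CA bundle through configuration. The env var name
# here is illustrative; True means "use the system trust store".
ca_bundle = os.environ.get("TEST_CA_BUNDLE", True)
resp = requests.get("https://test.internal.example/api", verify=ca_bundle)
```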

“There was a paper a couple of years back that found a huge percentage of Android applications were making mistakes like this, due to a combination of interface confusion and testing mistakes,” Green said.

The failure to properly validate TLS certificates was commonly observed by Veracode during their application security tests, according to Wysopal, and the CERT Coordination Center at Carnegie Mellon University has found that a lot of Android applications have the same problem.

Over the past few years there’s been a strong push to build encryption both into consumer applications, in response to revelations of mass Internet surveillance by intelligence agencies, and into enterprise software, in response to the increasing number of data breaches. But while everyone, from the general public to the government, seems to agree that encryption is important and we should have more of it, little attention is being paid to how it’s actually implemented into products.

If the situation doesn’t improve, we risk ending up with a false sense of security. We’ll have encryption built into everything, but it will be broken and our sensitive data will still be vulnerable to spies and would-be thieves.

Privacy advocates and tech giants support encryption, which the FBI director finds “depressing”

There’s a privacy battle brewing between the FBI and other federal government groups on one side, and tech companies, cryptologists, privacy advocates (and some elected American lawmakers) on the other.

Basically, the FBI (circa-2015 edition) opposes the use of encryption to keep data secure from hackers, on the grounds that the government couldn’t get at it either.

So this week, a wide variety of organizations ranging from civil-liberty groups and privacy advocates to tech companies and trade associations to security and policy experts sent President Obama an open letter urging him to reject any legislation that would outlaw secure encryption.

Change of heart

The FBI used to take the same view: encryption is a good way for innocent people to protect themselves and their personal data from criminals, so if encryption is available to you, you should use it.

In October 2012, the FBI’s “New E-Scams and Warnings” website even published an article warning that “Smartphone Users Should be Aware of Malware Targeting Mobile Devices and Safety Measures to Help Avoid Compromise.” That article included a bullet-pointed list of “Safety tips to protect your mobile device.”

And the second tip on the list says this: “Depending on the type of phone, the operating system may have encryption available. This can be used to protect the user’s personal data in the case of loss or theft.”

But James Comey, who took over the bureau as its current director in September 2013, takes a very different view of encryption: he thinks it only benefits criminals.

“Very dark place”

For example, when Apple launched its iPhone 6 last September, it bragged about the phone’s strong security features, including automatic data encryption. Comey then predicted that encrypted communications could lead to a “very dark place,” and criticized “companies marketing something expressly to allow people to place themselves beyond the law” (as opposed to, say, “Marketing something expressly so people know hackers can’t steal photographs, financial information and other personal data off their phones”).

Comey went so far as to suggest that Congress make data encryption illegal by rewriting the 20-year-old Communications Assistance for Law Enforcement Act to make it cover apps and other technologies that didn’t exist back in 1994.

And this week, in response to the tech companies’ and privacy advocates’ open letter to President Obama, Comey said he found the letter depressing: “I frankly found it depressing because their letter contains no [acknowledgment] that there are societal costs to universal encryption …. All of our lives, including the lives of criminals and terrorists and spies, will be in a place that is utterly unavailable to the court-ordered process. That, I think, to a democracy should be very concerning.”

Get a warrant

Yet despite Comey’s concerns, the idea that encryption would make it utterly impossible for police and courts to stop dangerous criminals is not true. Even with encryption, police or the FBI can still get data off your phone, as CNN’s Jose Pagliery has pointed out; they just can’t do it without your knowledge.

That’s what FBI Director James Comey finds “depressing,” or likely to lead to a “very dark place”: the idea that if the government wants access to your personal data, it still has to get a warrant first.

Flawed encryption leaves millions of smart grid devices at risk of cyberattacks

Millions of smart meters, thermostats, and other internet-connected devices are at risk of cyberattacks because they come with easily crackable encryption, a study has warned.

A paper by Philipp Jovanovic and Samuel Neves, published in late April, analyzed the cryptography used in the Open Smart Grid Protocol (OSGP), a group of specifications published by a European telecoms standards body. The protocol is used in more than four million devices and is said to be one of the most widely used protocols for smart grid devices today.

The results? Not great.

The researchers found that the “weak cryptography” can easily be cracked through a series of relatively simple attacks. In one case, the researchers said they could “completely” defeat a device’s cryptography.

The most common and trusted encryption standards use well-established, peer-reviewed ciphers that are open source and readily available to inspect; some have argued that openness is the “first rule” of crypto club. The problem for these smart grid devices is that OSGP’s homegrown cryptography doesn’t stand up to that community scrutiny.

The OSGP Alliance, the non-profit group behind the OSGP protocol, said last month it’s preparing an update to the specifications to add new security features.

“The alliance’s work on this security update is motivated by the latest recommended international cybersecurity practices, and will enhance both the primitives used for encryption and authentication as well as the key length, usage, and update rules and mechanisms,” the post read.

We reached out to the OSGP Alliance outside business hours, but did not immediately hear back.

Key management is the biggest pain of encryption

Most IT professionals rate the pain of managing encryption keys as severe, according to a new global survey by the Ponemon Institute.

On a scale of 1 to 10, respondents said that the risk and cost associated with managing keys or certificates was 7 or above, and cited unclear ownership of keys as the main reason. “There’s a growing awareness that the security benefits of encryption really accrue from the keys,” said Richard Moulds, vice president of product strategy at Thales e-Security, the sponsor of the report. “The algorithms that encrypt the data are all the same — what makes it secure is the keys.”

But as organizations use more encryption, they also end up with more keys, and more varieties of keys.

“In some companies, you might have millions of keys,” he said. “And every day, you generate more keys and they have to be managed and controlled. If the bad guy gets access to the keys, he gets access to the data. And if the keys get lost, you can’t access the data.”
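
One widespread way to keep millions of keys manageable is envelope encryption: each record is encrypted under its own data key, and every data key is in turn wrapped under a single master key held in a hardware security module or key management service, so that rotation, audit and access control concentrate on one key. The Python sketch below is a minimal illustration of that pattern, not any particular vendor’s API.

```python
# Envelope encryption sketch: per-record data keys, one wrapping master
# key. Losing one wrapped data key loses only that record; protecting
# the master key protects everything.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

master_key = AESGCM.generate_key(bit_length=256)  # in practice: lives in a KMS/HSM

def encrypt_record(plaintext: bytes) -> tuple[bytes, bytes]:
    data_key = AESGCM.generate_key(bit_length=256)  # fresh key per record
    nonce = os.urandom(12)
    ciphertext = nonce + AESGCM(data_key).encrypt(nonce, plaintext, None)
    wrap_nonce = os.urandom(12)
    wrapped_key = wrap_nonce + AESGCM(master_key).encrypt(wrap_nonce, data_key, None)
    return wrapped_key, ciphertext                  # store both together

def decrypt_record(wrapped_key: bytes, ciphertext: bytes) -> bytes:
    data_key = AESGCM(master_key).decrypt(wrapped_key[:12], wrapped_key[12:], None)
    return AESGCM(data_key).decrypt(ciphertext[:12], ciphertext[12:], None)
```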

Other factors that contributed to the pain were fragmented and isolated systems, lack of skilled staff, and inadequate management tools. And it’s hurting worse than before. “The proportion of people that rate it as higher levels of perceived pain is higher than last year,” said Moulds.

One reason that pain is increasing could be that encryption is becoming more ubiquitous, he said, embraced by industries and companies new to the challenges of managing keys and certificates.

According to the survey, which is now in its 10th year, the proportion of companies with no encryption strategy has declined from 38 percent in 2005 to 15 percent today. Meanwhile, the share of companies with an encryption strategy applied consistently across the entire enterprise has grown from 15 percent to 36 percent. The biggest growth last year was in healthcare and retail, two sectors hit by major public security breaches.

In the health and pharmaceutical industry, the share of companies with extensive use of encryption jumped from 31 to 40 percent. In retail, it rose from 21 to 26 percent. However, for the first time in the history of the survey, the proportion of the IT budget going to encryption has dropped. Between 2005 and 2013, it climbed steadily from 9.7 percent to 18.2 percent, but dropped to 15.7 percent in this year’s report.

The biggest driver for encryption was compliance, with 64 percent of respondents saying that they used encryption because of privacy or data security regulations or requirements.

Avoiding public disclosure after a data breach occurs was only cited as a driving factor by 9 percent of the respondents. Data residency, in which some countries allow protected data to leave national borders only if it’s encrypted, didn’t even make the list.

“It didn’t rank as high on the list of motivators as you would have thought,” said Moulds. “But data residency is an increasing driver, and I think it’s going to be a big driver in the future.”