Despite a big push over the past few years to use encryption to combat security breaches, lack of expertise among developers and overly complex libraries have led to widespread implementation failures in business applications.
The scale of the problem is significant. Cryptographic issues are the second most common type of flaw affecting applications across all industries, according to a report released this week by application security firm Veracode.
The report is based on static, dynamic and manual vulnerability analysis of over 200,000 commercial and self-developed applications used in corporate environments.
Cryptographic issues ranked higher in prevalence than historically common flaws like cross-site scripting, SQL injection and directory traversal. They included improper TLS (Transport Layer Security) certificate validation, cleartext storage of sensitive information, missing encryption for sensitive data, hard-coded cryptographic keys, inadequate encryption strength, insufficient entropy, non-random initialization vectors and improper verification of cryptographic signatures, among other weaknesses.
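To make a couple of these flaw classes concrete, here is a minimal Java sketch assuming an application that encrypts with the standard javax.crypto API. The class and field names are hypothetical, and the example is not taken from Veracode’s report.

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// Hypothetical illustration of two flaw classes from the list above; not from any real application.
public class WeakCryptoExample {

    // Hard-coded cryptographic key: anyone with access to the source code or the
    // compiled binary can recover it and decrypt the data.
    private static final byte[] KEY = "0123456789abcdef".getBytes();

    // Non-random initialization vector: reusing a fixed IV means identical plaintexts
    // produce identical ciphertexts, leaking patterns to an attacker.
    private static final byte[] IV = new byte[16];

    public static byte[] encrypt(byte[] plaintext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(KEY, "AES"), new IvParameterSpec(IV));
        return cipher.doFinal(plaintext);
    }
}
```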
The majority of the affected applications were Web-based, but mobile apps also accounted for a significant percentage.
Developers are adding a lot of crypto to their code, especially in sectors like health care and financial services, but they’re doing it poorly, said Veracode CTO Chris Wysopal.
Many organizations need to use encryption because of data protection regulations, but the report suggests their developers don’t have the necessary training to implement it properly. “It goes to show how hard it is to implement cryptography correctly,” Wysopal said. “It’s sort of an endemic issue that a lot of people don’t think about.”
Many developers believe they know how to implement crypto, but they haven’t had any specific training in cryptography and have a false sense of security, he said. As a result, they end up with applications where encryption is present, so the compliance checkbox gets ticked, but attackers are still able to get at sensitive data.
And that doesn’t even touch on cases where developers decide to create their own crypto algorithms, a bad idea that’s almost always destined to fail. Veracode only tested implementations that used standard cryptographic APIs (application programming interfaces) offered by programming languages like Java and .NET or popular libraries like OpenSSL.
Programming languages like Java and .NET try to protect developers from making errors more than older languages like C, said Carsten Eiram, the chief research officer at vulnerability intelligence firm Risk Based Security, via email.
“However, many people argue that since modern languages are easier to program in and protect programmers more from making mistakes, more of them may be lulled into a false sense of security and not show proper care when coding, i.e. increasing the risk of introducing other types of problems like design and logic errors. Not implementing crypto properly would fall into that category,” Eiram said.
Too many programmers think that they can just link to a crypto library and they’re done, but cryptography is hard to implement robustly if you don’t understand the finer aspects of it, like checking certificates properly, protecting the encryption keys, using appropriate key sizes or using strong pseudo-random number generators.
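As a rough sketch of two of those finer points, the fragment below uses Java’s standard KeyGenerator and SecureRandom classes to pick an adequate key size and to generate initialization vectors from a cryptographically strong random source; the class and method names are illustrative assumptions, not code from the report.

```java
import java.security.SecureRandom;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class KeyAndIvGeneration {

    // Appropriate key size: a 256-bit AES key rather than a short or truncated one.
    public static SecretKey newAesKey() throws Exception {
        KeyGenerator generator = KeyGenerator.getInstance("AES");
        generator.init(256);
        return generator.generateKey();
    }

    // Strong pseudo-random number generator: SecureRandom draws from a cryptographically
    // secure source, unlike java.util.Random, which is predictable and unsuitable for
    // keys, IVs or session tokens.
    public static byte[] newIv(int length) {
        byte[] iv = new byte[length];
        new SecureRandom().nextBytes(iv);
        return iv;
    }
}
```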
“All this ultimately comes down to better education of programmers to understand all the pitfalls when implementing strong crypto,” Eiram said.
But it’s not only the developers’ fault. Matthew Green, a professor of cryptography engineering at Johns Hopkins University in Baltimore, thinks that many crypto libraries are “downright bad” from a usability perspective because they’ve been designed by and for cryptographers. “Forcing developers to use them is like expecting someone to fly an airplane when all they have is a driver’s license,” he said via email.
Green believes that making cryptographic software easier to use — ideally invisible so that people don’t even have to think about it — would be a much more efficient approach than training developers to be cryptographers.
“We don’t expect developers to re-implement TCP [a core Internet protocol] or the entire file system every time they write something,” he said. “The fact that current crypto APIs are so bad is just a reflection of the fact that crypto, and security in general, are less mature than those other technologies.”
The authors of some cryptographic libraries are aware that their creations should be easier to use. For example, the OpenSSL project’s roadmap, published last June, lists reducing API complexity and improving documentation as goals to be reached within one year.

While not disputing that some crypto libraries are overly complex, Eiram doesn’t agree that developers need to be cryptographers in order to implement crypto correctly.
The crypto APIs in Java and .NET — the programming languages most used by the apps covered in Veracode’s report — were designed specifically for developers and provide most of what they need in terms of crypto features when developing applications in those languages, Eiram said.
“While it’s always preferable that libraries including crypto libraries are made to be used as easily as possible, the programmers using them ultimately need to at least understand on a high level how they work,” he said. “I really see it as a two-way street: Make crypto as easy to use as possible, but programmers having to implement crypto in applications should also properly educate themselves instead of hoping for someone to hold their hand.”
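To illustrate Eiram’s point about the built-in APIs, here is a minimal sketch of authenticated encryption done entirely with the standard javax.crypto classes that ship with Java; the helper name and the choice of AES-GCM are assumptions made for the example, not recommendations drawn from the report.

```java
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class BuiltInCryptoSketch {

    // Authenticated encryption with AES-GCM using only the standard library.
    // A fresh 12-byte nonce is generated per message and prepended to the ciphertext.
    public static byte[] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        byte[] nonce = new byte[12];
        new SecureRandom().nextBytes(nonce);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, nonce));
        byte[] ciphertext = cipher.doFinal(plaintext);

        byte[] output = new byte[nonce.length + ciphertext.length];
        System.arraycopy(nonce, 0, output, 0, nonce.length);
        System.arraycopy(ciphertext, 0, output, nonce.length, ciphertext.length);
        return output;
    }
}
```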
In addition to the lack of crypto expertise among developers and the complexity of some crypto libraries, forgetting to turn security features back on after product testing is another common source of failures, according to Green. For example, developers will often turn off TLS certificate validation in their testing environments because they don’t have a valid certificate installed on their test servers, but then forget to turn it back on when the product moves into production.
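The shortcut Green describes often looks something like the following Java sketch, which installs a trust manager that accepts any certificate; the class name is hypothetical, and the point is simply that if code like this leaks from a test environment into production, certificate validation is silently gone.

```java
import java.security.cert.X509Certificate;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;

public class TrustAllForTesting {

    // "Trust everything" TLS context: the connection is still encrypted, but any
    // certificate is accepted, so a man-in-the-middle can present its own certificate
    // and read or modify the traffic. Intended only for test servers without valid
    // certificates; shipping it disables certificate validation entirely.
    public static SSLContext insecureContext() throws Exception {
        TrustManager acceptAll = new X509TrustManager() {
            @Override public void checkClientTrusted(X509Certificate[] chain, String authType) {}
            @Override public void checkServerTrusted(X509Certificate[] chain, String authType) {}
            @Override public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
        };
        SSLContext context = SSLContext.getInstance("TLS");
        context.init(null, new TrustManager[] { acceptAll }, null);
        return context;
    }
}
```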
“There was a paper a couple of years back that found a huge percentage of Android applications were making mistakes like this, due to a combination of interface confusion and testing mistakes,” Green said.
The failure to properly validate TLS certificates was commonly observed by Veracode during its application security tests, according to Wysopal, and the CERT Coordination Center at Carnegie Mellon University has found that many Android applications have the same problem.
Over the past few years there’s been a strong push to build encryption both into consumer applications, in response to revelations of mass Internet surveillance by intelligence agencies, and into enterprise software, in response to the increasing number of data breaches. But while everyone, from the general public to the government, seems to agree that encryption is important and we should have more of it, little attention is being paid to how it’s actually implemented in products.
If the situation doesn’t improve, we risk ending up with a false sense of security. We’ll have encryption built into everything, but it will be broken and our sensitive data will still be vulnerable to spies and would-be thieves.