Western Digital’s hard drive encryption is useless. Totally useless

The encryption systems used in Western Digital’s portable hard drives are pretty pointless, according to new research.

WD’s My Passport boxes automatically encrypt data as it is written to disk and decrypt the data as it is read back to the computer. The devices use 256-bit AES encryption, and can be password-protected: giving the correct password enables the data to be successfully accessed.

Now, a trio of infosec folks – Gunnar Alendal, Christian Kison and “modg” – have tried out six models in the WD My Passport family, and found blunders in the software designs.

For example, on some models, the drive’s encryption key can be trivially brute-forced, which is bad news if someone steals the drive: decrypting it is child’s play. And the firmware on some devices can be easily altered, allowing an attacker to silently compromise the drive and its file systems.

“We developed several different attacks to recover user data from these password-protected and fully encrypted external hard disks,” the trio’s paper [PDF] [slides PDF] states.

“In addition to this, other security threats are discovered, such as easy modification of firmware and on-board software that is executed on the user’s PC, facilitating evil maid and badUSB attack scenarios, logging user credentials, and spreading of malicious code.”

My Passport models using a JMicron JMS538S micro-controller have a pseudorandom number generator that is not cryptographically safe, and only cycles through a series of 255 32-bit values. This generator is used to create the data encryption key, and the drive firmware leaks enough information about the random number generator for this key to be recreated by brute-force, we’re told.

“An attacker can regenerate any DEK [data encryption key] generated from this vulnerable setup with a worst-case complexity of close to 2^40,” the paper states.

“Once the DEK [data encryption key] is recovered, an attacker can read and decrypt any raw disk sector, revealing decrypted user data. Note that this attack does not need, nor reveals, the user password.”
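The shape of the attack is easy to illustrate. The sketch below is a toy model, not WD's actual key-derivation code: it assumes a "random" generator whose entire state is a single byte, so only 255 distinct keys are possible and an attacker can simply enumerate them (the real attack enumerates a space of roughly 2^40 candidates, but the principle is identical).

```python
import hashlib

# Toy model of the flaw (NOT WD's actual key-derivation code): a
# "random" generator whose entire state is one byte can only ever
# yield 255 distinct keys, so an attacker enumerates them all.
def weak_keygen(seed_byte: int) -> bytes:
    # stand-in for deriving a 256-bit DEK from the weak PRNG output
    return hashlib.sha256(bytes([seed_byte]) * 16).digest()

# The "drive" picks its key from the tiny space.
secret_key = weak_keygen(0x5A)

# The attacker cannot read the key directly, but can test candidates
# against known-plaintext structure on disk (modelled here by a hash
# fingerprint; a real attack would trial-decrypt a predictable sector).
fingerprint = hashlib.sha256(secret_key).digest()

recovered = None
for seed in range(1, 256):
    candidate = weak_keygen(seed)
    if hashlib.sha256(candidate).digest() == fingerprint:
        recovered = candidate
        break

assert recovered == secret_key
```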

Drive models using a JMicron JMS569 controller – which is present in newer My Passport products – can be forcibly unlocked using commercial forensic tools that access the unencrypted system area of the drive, we’re told.

Drives using a Symwave 6316 controller store their encryption keys on the disk, encrypted with a known hardcoded AES-256 key stored in the firmware, so recovery of the data is trivial.
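Why a hardcoded key-encrypting key is fatal can be shown in a few lines. The toy stream cipher and the constant below are stand-ins, not the Symwave firmware's real cipher or key; the point is the key-management flaw, not the cipher.

```python
import hashlib

# Toy stream cipher as a stand-in for AES-256.
def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

HARDCODED_KEY = b"\x13" * 32  # hypothetical constant baked into the firmware

dek = b"\xaa" * 32            # the per-drive data encryption key
stored_on_disk = bytes(x ^ y for x, y in zip(dek, keystream(HARDCODED_KEY, 32)))

# Anyone who extracts HARDCODED_KEY from a firmware dump can decrypt
# the stored blob and recover the DEK -- and with it, all user data.
recovered = bytes(x ^ y for x, y in zip(stored_on_disk, keystream(HARDCODED_KEY, 32)))
assert recovered == dek
```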

Meanwhile, Western Digital says it is on the case.

“WD has been in a dialogue with independent security researchers relating to their security observations in certain models of our My Passport hard drives,” spokeswoman Heather Skinner told The Register in a statement.

“We continue to evaluate the observations. We highly value and encourage this kind of responsible community engagement because it ultimately benefits our customers by making our products better. We encourage all security researchers to responsibly report potential security vulnerabilities or concerns to WD Customer Service.”

The NSA may have been able to crack so much encryption thanks to a simple mistake

The NSA could have gained a significant amount of its access to the world’s encrypted communications thanks to the high-tech version of reusing passwords, according to a report from two US academics.

Computer scientists J Alex Halderman and Nadia Heninger argue that a common mistake made with a regularly used encryption protocol leaves much encrypted traffic open to eavesdropping from a well-resourced and determined attacker such as the US National Security Agency.

The information about the NSA leaked by Edward Snowden in the summer of 2013 revealed that the NSA broke one sort of encrypted communication, virtual private networks (VPN), by intercepting connections and passing some data to the agency’s supercomputers, which would then return the key shortly after. Until now, it was not known what those supercomputers might be doing, or how they could be returning a valid key so quickly, when attacking a VPN head-on should take centuries, even with the fastest computers.

The researchers say the flaw exists in the way much encryption software applies an algorithm called Diffie-Hellman key exchange, which lets two parties efficiently communicate through encrypted channels.

A form of public key cryptography, Diffie-Hellman lets users communicate by swapping “keys” and running them through an algorithm which results in a secret key that both users know, but no-one else can guess. All the future communications between the pair are then encrypted using that secret key, and would take hundreds or thousands of years to decrypt directly.

But the researchers say an attacker may not need to target it directly. Instead, the flaw lies in the exchange at the start of the process. Each person generates a public key – which they tell to their interlocutor – and a private key, which they keep secret. But they also rely on a common public parameter: a (very) large prime number which is agreed upon at the start of the process.
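The exchange described above can be sketched in a few lines. This is textbook Diffie-Hellman with a toy modulus (2**127 - 1, a Mersenne prime) chosen only so the example runs instantly; real deployments use carefully generated primes of 1024 bits or more.

```python
import secrets

# Textbook Diffie-Hellman with an illustrative modulus -- NOT a
# parameter set any real deployment should use.
p = 2**127 - 1          # the "common" public prime
g = 3                   # public generator

a = secrets.randbelow(p - 3) + 2   # Alice's private key
b = secrets.randbelow(p - 3) + 2   # Bob's private key

A = pow(g, a, p)        # Alice's public value, sent to Bob
B = pow(g, b, p)        # Bob's public value, sent to Alice

# Each side combines its own secret with the other's public value and
# arrives at the same shared key; an eavesdropper sees only p, g, A, B.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```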

Since those prime numbers are public anyway, and since it is computationally expensive to generate new ones, many encryption systems reuse them to save effort. In fact, the researchers note, one single prime is used to encrypt two-thirds of all VPNs and a quarter of SSH servers globally, two major security protocols used by a number of businesses. A second is used to encrypt “nearly 20% of the top million HTTPS websites”.

The problem is that, while there’s no need to keep the chosen prime number secret, once a given proportion of conversations are using it as the basis of their encryption, it becomes an appealing target. And it turns out that, with enough money and time, those commonly used primes can become a weak point through which encrypted communications can be attacked.

In their paper, the two researchers, along with a further 12 co-authors, describe their process: a single, extremely computationally intensive “pre-calculation” which “cracks” the chosen prime, letting them break communications encrypted using it in a matter of minutes.

How intensive? For “shorter” primes (512 bits long, about 150 decimal digits), the precalculation takes around a week – crippling enough that, after it was disclosed with the catchy name of “Logjam”, major browsers were changed to reject shorter primes in their entirety. But even for the gold standard of the protocol, using a 1024-bit prime, a precalculation is possible, for a price.

The researchers write that “it would cost a few hundred million dollars to build a machine, based on special purpose hardware, that would be able to crack one Diffie-Hellman prime every year.”

“Based on the evidence we have, we can’t prove for certain that NSA is doing this. However, our proposed Diffie-Hellman break fits the known technical details about their large-scale decryption capabilities better than any competing explanation.”

There are ways around the problem. Simply using a unique common prime for each connection, or even for each application, would likely reduce the reward for the year-long computation time so that it was uneconomical to do so. Similarly, switching to a newer cryptography standard (“elliptic curve cryptography”, which uses the properties of a particular type of algebraic curve instead of large prime numbers to encrypt connections) would render the attack ineffective.

But that’s unlikely to happen fast. Some occurrences of Diffie-Hellman literally hard-code the prime in, making it difficult to change overnight. As a result, “it will be many years before the problems go away, even given existing security recommendations and our new findings”.

“In the meantime, other large governments potentially can implement similar attacks, if they haven’t already.”

The next steps for the White House on encryption

THE OBAMA administration’s decision not to seek legislation requiring technology companies to give law enforcement access to encrypted communications on smartphones has a certain logic. In this age of hacking and cyberintrusion, encryption can keep most people safer. But the decision also carries risks. Encryption can give a tiny band of criminals and terrorists a safe haven. The United States must now make the most of the useful side of encryption, but without losing sight of the risks.

FBI Director James B. Comey warned last year that law enforcement might be “going dark” because technology companies, including Apple and Google, are introducing ways for users to send encrypted messages by smartphones that can be unlocked only by the users, not by the companies. Mr. Comey was alarmed this would give criminals and terrorists a place to communicate that was beyond reach even of law enforcement with a court order. Mr. Comey suggested Congress require tech companies to provide what is known as extraordinary access to encrypted information, a “lawful intercept” capability, sometimes referred to as a backdoor, or a special key for the government. We sympathized with Mr. Comey’s appeal and urged all sides to look for a compromise.

No compromise was forthcoming. The reaction to Mr. Comey’s suggestion in the technology world was a strong protest that any weakening of encryption — even a tiny bit, for a good reason — creates a vulnerability for all. The firms also made the argument that encryption can be a positive force in today’s chaotic world of cyberattacks; their customers want absolute privacy, too, for the digital lives held on the smartphones in their pockets. They also pointed out that if backdoor access is granted to the U.S. government, it will provide cover for authoritarian governments such as China and Russia to demand the same or worse.

Mr. Comey said last week that private talks with the tech companies have been “increasingly productive.” That is promising. There are methods the FBI might use to crack encryption case by case or to find the information elsewhere. The FBI and state and local law enforcement are most in need; the National Security Agency has much stronger tools for breaking encryption overseas.

Having stood up to Mr. Comey, Silicon Valley should demonstrate the same fortitude when it comes to China and Russia and absolutely refuse to allow intrusions by these and other police states. It would help, too, if President Obama articulated the principle loud and clear.

That leaves a nagging worry. The United States is a rule-of-law nation, and encryption technology is creating a space that is in some ways beyond the reach of the law. Encryption may indeed be valuable to society if it protects the majority. But what if it enables or protects the 1 percent who are engaged in criminality or terrorism? That threat has to be taken into account, and so far it remains unresolved. It will not go away.

Aadhaar encryption protects privacy, will take eons to crack

The Aadhaar system’s data collection and storage is strongly protected by sophisticated encryption processes to ensure biometric data does not leak either through private contractors running enrollment centres or at the central data servers that store the details.

The Unique Identification Authority of India’s processes are intended to allay fears that biometric data collected by private contractors might fall into unauthorized hands: the biometric detail is encrypted using the strongest available public key cryptography.

Even if the data is stolen or lost, the encryption prevents access to the biometrics, as even the most powerful computers would take literally eons to crack the code. Similarly, at the central data centre, the encryption processes are repeated while storing the details, making attempts to access and use the data very difficult.

The government hopes that the lack of human interface in storing the data, and procedures such as data collectors being required to authenticate every entry through their own biometric verification, will help convince the Supreme Court that privacy concerns have been addressed by the UIDAI.

The UIDAI programme’s success is indicated by the lack of any credible complaint or proof of misuse of data since the ambitious scheme started almost five years ago. This is partly due to the processes that make even the loss of a recording machine, or copying onto a flash drive, a futile exercise.

The data are collected using the Enrollment Client (EC) software – written, maintained and provided by the UIDAI – and are encrypted to prevent leaks both at the enrollment centres managed by private vendors and in transit. The private agencies on the ground use the EC software, which ensures that only authenticated and approved personnel can sign in to enrol people.

The enrollment client software used by private vendors strongly encrypts individual electronic files containing demographic and biometric details (enrollment data packets) of residents at the time of enrollment and even before the data is saved in any hard disk.

The encryption uses the strongest available public key cryptography (PKI-2048 and AES-256), with each data record having a built-in mechanism to detect any tampering.
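The "built-in mechanism to detect any tampering" is typically a keyed integrity check over the encrypted packet. The actual Aadhaar packet format is not public, so the following is only the general pattern, sketched with an HMAC and a made-up key: any modification of the stored bytes fails verification.

```python
import hashlib
import hmac

# General pattern only: the key below is made up, and the real
# Aadhaar enrollment packet format is not public.
mac_key = b"\x42" * 32

def seal(packet: bytes) -> bytes:
    # append a keyed MAC over the (already encrypted) packet
    return packet + hmac.new(mac_key, packet, hashlib.sha256).digest()

def verify(sealed: bytes) -> bool:
    packet, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(mac_key, packet, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

sealed = seal(b"encrypted-biometric-record")
assert verify(sealed)

# Flip one byte anywhere in the packet and the check fails.
tampered = sealed[:5] + b"X" + sealed[6:]
assert not verify(tampered)
```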

The e-data packages are always stored on disk in PKI-encrypted form and are never decrypted or modified during transit, making them inaccessible to any system or person.

Among other security measures, UIDAI has ensured that the Aadhaar database is not linked to any other database or to information held in other databases; its only purpose is to verify a person’s identity at the point of receiving a service, and then only with the consent of the Aadhaar number holder.

Encrypted Smartphones Challenge Investigators

Law-enforcement officials are running up against a new hurdle in their investigations: the encrypted smartphone.

Officials say they have been unable to unlock the phones of two homicide victims in recent months, hindering their ability to learn whom those victims contacted in their final hours. Even more common, say prosecutors from New York, Boston and elsewhere, are locked phones owned by suspects, who refuse to turn over passcodes.

Manhattan District Attorney Cyrus Vance says his office had 101 iPhones that it couldn’t access as of the end of August, the latest data available.

The disclosures are the latest twist in a continuing dispute between law-enforcement officials and Apple Inc. and Google Inc., after the two tech companies released software last year that encrypted more data on new smartphones. The clash highlights the challenge of balancing the privacy of phone users with law enforcement’s ability to solve crimes.

“Law enforcement is already feeling the effects of these changes,” Hillar Moore, the district attorney in Baton Rouge, La., wrote to the Senate Judiciary Committee in July. Mr. Moore is investigating a homicide where the victim’s phone is locked. He is one of 16 prosecutors to send letters to the committee calling for back doors into encrypted devices for law enforcement.

The comments are significant because, until now, the debate over encrypted smartphones has been carried by federal officials. But local police and prosecutors handle the overwhelming share of crimes in the U.S., and district attorneys say encryption gives bad guys an edge.

Encrypted phones belonging to victims further complicate the issue, because some families want investigators to have access to the phones.

“Even if people are not terribly sympathetic to law-enforcement arguments, this situation might cause them to think differently,” said Paul Ohm, a Georgetown University Law Center professor and former prosecutor.

Last week, Federal Bureau of Investigation Director James Comey told a Senate hearing that the administration doesn’t want Congress to force companies to rewrite their encryption code. “The administration is not seeking legislation at this time,” White House National Security Council spokesman Mark Stroh said in a written statement Monday.

Some independent experts say the handful of cases that have emerged so far isn’t enough to prove that phone encryption has altered the balance between law enforcement and privacy. In many cases, they say, investigators can obtain the encrypted information elsewhere, from telephone companies, or because the data was backed up on corporate computers.

“It depends on what the success rate is of getting around this technology,” said Orin Kerr, a George Washington Law professor.

Apple encrypted phones by default beginning with iOS 8, the version of its mobile-operating system released last fall. The decision came amid public pressure following former national-security contractor Edward Snowden’s revelations of tech-company cooperation with government surveillance.

With iOS 8, and the newly released iOS 9, Apple says it cannot unlock a device with a passcode. That means Apple cannot provide information to the government on users’ text messages, photos, contacts and phone calls that don’t go over a telephone network. Data that isn’t backed up elsewhere is accessible only on the password-protected phone.

“We have the greatest respect for law enforcement and by following the appropriate legal process, we provide the relevant information we have available to help,” Apple wrote in a statement to The Wall Street Journal.

Apple Chief Executive Tim Cook is an advocate of encryption. “Let me be crystal clear: Weakening encryption, or taking it away, harms good people that are using it for the right reasons,” he said at a conference earlier this year.

Only some phones, such as the Nexus 6 and the Nexus 9, running Google’s Android Lollipop system are encrypted by default. Google declined to comment about the role of encryption in police investigations.

Three of the 16 district attorneys who wrote to the Senate—from Boston, Baton Rouge and Brooklyn—told the Journal they were aware of cases where encrypted phones had hindered investigations. Investigators in Manhattan and Cook County in Illinois also have cases dealing with encrypted phones. Investigators say, however, they have no way of knowing whether or not the locked phones contain valuable evidence.

Mr. Moore, of Baton Rouge, thinks there might be important information on a victim’s phone. But he can’t access it.

Brittany Mills of Baton Rouge used her iPhone 5s for everything from sending iMessages to writing a diary, and she didn’t own a computer, her mother said. Ms. Mills, a 28-year-old patient caregiver, was shot to death at her door in April when she was eight months pregnant.

Police submitted a device and account information subpoena to Apple, which responded that it couldn’t access anything from the device because it was running iOS 8.2. Mr. Moore thinks the iCloud data Apple turned over won’t be helpful because the most recent backup was in February, two months before her death. The records he obtained of her phone calls yielded nothing.

“When something as horrible as this happens to a person, there should be no roadblock in the way for law enforcement to get in there and catch the person as quickly as possible,” said Barbara Mills, Brittany Mills’s mother.

Investigators in Evanston, Ill., are equally stumped by the death of Ray C. Owens, 27. Mr. Owens was found shot to death in June with two phones police say belonged to him, an encrypted iPhone 6 and a Samsung Galaxy S6 running Android. A police spokesman said the Samsung phone is at a forensics lab, where they are trying to determine if it is encrypted.

The records that police obtained from Apple and service providers had no useful information, he added. Now the investigation is at a standstill.

“In the past this would have been easy for us,” said Evanston Police Commander Joseph Dugan. “We would have an avenue for this information, we’d get a subpoena, obtain a record, further our investigation.”

Barbara Mills is committed to making sure more families don’t have to see cases go unsolved because of phone encryption. “Any time you have a situation of this magnitude, if you can’t depend on law enforcement, who can you depend on?”

Risk Analysis, Encryption Stressed in HITECH Act Final Rules

Two final rules for the HITECH electronic health record incentive program strongly emphasize the value of risk assessments and encryption as measures for safeguarding patient information.

A new rule establishing requirements for proving a provider is a “meaningful user” for Stage 3 of the incentive program requires protecting patient data through the implementation of appropriate technical, administrative and physical safeguards and conducting a risk analysis that includes assessing encryption of ePHI created or maintained by a certified electronic health record.

A companion final rule setting 2015 standards for certifying EHR software as qualifying for the program requires the software to be capable of creating hashes using an algorithm with security strength equal to or greater than SHA-2.
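The practical difference is easy to see from Python's standard library: SHA-1 produces a 160-bit digest, while SHA-256, a member of the SHA-2 family, produces 256 bits. (The message below is an arbitrary illustration, not part of any certification test.)

```python
import hashlib

msg = b"patient-record"

h1 = hashlib.sha1(msg).hexdigest()     # SHA-1: being phased out
h2 = hashlib.sha256(msg).hexdigest()   # SHA-256: a SHA-2 family member

print(len(h1) * 4)   # 160 -- digest length in bits
print(len(h2) * 4)   # 256
```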

The Department of Health and Human Services’ Centers for Medicare and Medicaid Services says the Stage 3 requirements are optional in 2017. Providers who choose to begin Stage 3 in 2017 will have a 90-day reporting period. However, all providers will be required to comply with Stage 3 requirements beginning in 2018 using EHR technology certified to the 2015 Edition requirements.

When it comes to privacy and security requirements included in the final rules, versus what was in the proposed rules, there were “no significant changes, no surprises,” says John Halamka, CIO of Beth Israel Deaconess Medical Center.

Some privacy and security experts, however, point out the rules spotlight the importance of safeguarding electronic protected health information through measures such as risk analysis, encryption and secure data exchange. But some observers criticize HHS for not offering more detailed guidance on risk assessments.

Risk Analysis

While conducting a risk analysis was also a requirement in Stages 1 and 2 of the meaningful use program, the final rule for Stage 3 requires that healthcare providers drill down further by “conducting or reviewing a security risk analysis … including addressing the security – to include encryption – of electronic protected health information created or maintained by certified electronic health record technology … and implement security updates as necessary and correct identified security deficiencies.”

The objective of that requirement is to protect electronic health information through the implementation of “appropriate technical, administrative and physical safeguards,” the rule states. Rulemakers stress assessing the data created or maintained by an electronic health record system, versus conducting a more comprehensive security risk assessment as required under the HIPAA Security Rule.

“Although [HHS’] Office for Civil Rights does oversee the implementation of the HIPAA Security Rule and the protection of patient health information, we believe it is important and necessary for a provider to attest to the specific actions required to protect ePHI created or maintained by CEHRT in order to meet the EHR incentive program requirements,” the rule notes. “In fact, in our audits of providers who attested to the requirements of the EHR Incentive Program, this objective and measure are failed more frequently than any other requirement.

“This objective and measure are only relevant for meaningful use and this program, and are not intended to supersede what is separately required under HIPAA and other rulemaking. We do believe it is crucial that all [eligible healthcare providers] evaluate the impact CEHRT has on their compliance with HIPAA and the protection of health information in general.”

New to the risk analysis requirement is the addition of assessing administrative and technical safeguards. “This measure enables providers to implement risk management security measures to reduce the risks and vulnerabilities identified. Administrative safeguards – for example, risk analysis, risk management, training and contingency plans – and physical safeguards – for example, facility access controls, workstation security – are also required to protect against threats and impermissible uses or disclosures to ePHI created or maintained by CEHRT.”

Missed Opportunity?

HHS should have used the final rule to offer even more helpful guidance about risk assessments, says privacy attorney David Holtzman, vice president of compliance at the security consulting firm CynergisTek.

“CMS focused significant attention to the role of risk analysis in safeguarding the privacy and security of health information created or maintained in an EHR,” he says. “However, they missed an important opportunity to … ensure that administrative and physical safeguards requirements of the HIPAA Security Rule are assessed in any security risk analysis.”

To guide healthcare providers, including smaller doctors’ offices, in conducting the Stage 3 risk analysis, the rule makes note of free tools and resources available to assist providers, including a Security Risk Assessment Tool developed by ONC and OCR.

But the use of that tool is daunting for some smaller healthcare entities, contends Keith Fricke, principal consultant at consulting firm tw-Security.

“The SRA tool is too overbearing for any organization to use, let alone small healthcare organizations, including small provider offices,” he says.

Secure Data Exchange

Besides a renewed focus on risk analysis, other privacy and security related enhancements to the meaningful use Stage 3 final rule include an emphasis on encryption and secure messaging.

“More than half of the objectives in Stage 3 starting in 2017 require EHRs to have interoperable exchange technology that is encrypted and offered to relying parties with strong identity assurance,” said David Kibbe, M.D., CEO of DirectTrust, which created and maintains a framework for secure e-mail in the healthcare sector.

“DirectTrust’s work can and will be relied upon for multiple Stage 2 and 3 objectives and criteria announced by CMS in the new rule,” he says.

For instance, secure electronic messaging to communicate with patients on relevant health information is an objective in Stage 3, with a series of measurements.

Software Certification Rule

While privacy and security are woven through the final rule for Stage 3 of the meaningful use program for healthcare providers, HHS’ Office of the National Coordinator for Health IT also raised the bar on requirements in the final rule for 2015 Edition health IT software certification. That includes phasing in requirements for more robust encryption.

“Given that the National Institute of Standards and Technology, technology companies, and health IT developers are moving away from SHA-1, we believe now is the appropriate time to move toward the more secure SHA-2 standard,” ONC wrote in its rulemaking.

The rule also states: “We note that there is no requirement obligating health IT developers to get their products certified to this requirement immediately, and we would expect health IT developers to not begin seeking certification to this criterion until later in 2016 for implementation in 2017 and 2018. We further note that certification only ensures that a health IT module can create hashes using SHA-2; it does not require the use of SHA-2. For example, users of certified health IT may find it appropriate to continue to use SHA-1 for backwards compatibility if their security risk analysis justifies the risk.”

Some other safeguard features, such as data segmentation for privacy of sensitive health information, are included in the software certification rule as optional, Halamka notes. “That’s appropriate for immature standards,” he says.

Public Input

CMS is continuing to seek public comment on the “meaningful use” rule for 60 days. This input could be considered by CMS for future policy developments for the EHR incentive program, as well as other government programs, the agency says.

However, this additional public comment period could become problematic, Holtzman contends. “The adoption of the changes in the objective and measures as a ‘final rule with comment’ could cause delays in EHR vendors and developers in producing upgrades to their technology. The uncertainty in that CMS could make further changes in the months ahead might encourage these industry partners to hold off in their production process.”

National Encryption Policy: Not just privacy, but also feasibility and security are at risk

Encryption is an important aspect that governs not just communications but also storage. When data is in motion, there are several methods/protocols which facilitate end-to-end encryption:

1. VPN

2. Remote Server Connectivity viz. RDP, SSH

3. Internet based Voice/ Messaging Communications

4. email communication

5. Communications between Wearables and their Host devices

6. Web-Services providing encryption services viz. Etherpad, Gist

However, when it concerns data at rest, i.e. data stored on the disk, there are numerous scenarios which fall under the purview of encryption:

1. On the Fly Disk Encryption which may also include the entire OS

2. Password protection of files

3. email Message Encryption

4. Full disk-encryption by Smartphones

Recently, the Government of India released a draft National Encryption Policy and withdrew it within 24 hours of releasing it, with a promise that the policy would be re-drafted and re-released.

In those 24 hours, the IT security members of the Indian Internet Security forum took up the cause of protecting user privacy, reprimanding the government for the ill-conceived draft of the National Encryption Policy. Their efforts forced the government to revoke the draft proposal and work on a better one.

According to the draft, the B2B, B2C and C2B sectors shall use encryption algorithms and key sizes as prescribed by the government. Moreover, according to the draft:

“On demand, the user shall be able to reproduce the same Plain text and encrypted text pairs using the software/ hardware used to produce the encrypted text from the given plain text. Such plain text information shall be stored by the user/ organization/ agency for 90 days from the date of transaction and made available to Law Enforcement Agencies as and when demanded in line with the provisions of the laws of the country.”

Furthermore, the draft also issued guidelines for communication with a foreign entity: “the primary responsibility of providing readable plain-text along with the corresponding Encrypted information shall rest on entity (B or C) located in India.”

The draft policy requires service providers, irrespective of their country of origin, to enter into an agreement with the Government of India, and the consumers of these services (government, business, citizens) are expected to produce the plain-text/encrypted datasets.

The question is not why, but how it would be technically feasible for a customer to maintain this information, given that encryption was used precisely to secure the data from rogue entities. Storing anything in plain text for any period defeats the entire purpose of using encryption, except for the solace that the channel used for transmitting the data is secured. The draft sets very high, impossible-to-achieve expectations: every citizen and organization, irrespective of their field of expertise, is expected to understand the internal workings of these third-party applications, and at the same time to maintain the two different data-sets.

Furthermore, the draft requires anything encrypted by an individual, be it personal documents or communication between two people (which, interestingly, the rest of the world considers a private affair), to be made available for scrutiny as and when demanded.

Expecting a consumer of various services, whether an organization or an individual, to understand the internal functionality of every service and piece of software, and to consciously maintain the two separate data sets, is simply not feasible.

Even though the government issued a clarification exempting:

- mass-use encryption products currently being used in web applications, social media sites, and social media applications such as WhatsApp, Facebook, Twitter, etc.
- SSL/TLS encryption products used in Internet banking and payment gateways, as directed by the Reserve Bank of India
- SSL/TLS encryption products used for e-commerce and password-based transactions

it still raises quite a few eyebrows, especially about the intention behind drafting this National Encryption Policy. Not just privacy, but also feasibility and security are at risk.

The argument until now has been about data that resides on your disk; what do these same standards mean for encrypted communication channels and services? One word summarizes it all: “Impossible”. Over-the-network encryption such as VPN or SSH, and cloud-based services of any type, which have lately made inroads into our lives, would be rendered useless, and their very existence in India would be at risk: not just because it would be mandatory for all of them to enter into an agreement with the Government of India, but because the consumers of these services would also have to maintain a separate copy of the content.

Applications and service providers offering secure messaging, i.e., encrypted voice channels or self-destructing messages, in order to provide better privacy and discourage eavesdropping, would in all probability be banned, or might have to remove these features to cater to the Indian audience. Above all, how do the policy-makers expect consumers to comply?

What happens when a person from a different country uses these services in India? Wouldn’t that person be violating Indian law and, in all probability, be considered a criminal?

The draft also requires all stakeholders to use symmetric encryption products based on AES, Triple DES, or RC4, with key sizes up to 256 bits.

Back in 2011, researchers (including at Microsoft Research) described a biclique attack on AES, and while that attack is only theoretical, Triple DES is widely considered weak, and RC4 is simply not acceptable as an encryption algorithm to any security-conscious organization. Mandating such ageing, fixed algorithms is rarely, if ever, how organizations draw up their own encryption policies.
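RC4’s weakness is easy to demonstrate empirically. The sketch below is a plain textbook RC4, not any vendor’s implementation; it shows the well-known Mantin-Shamir bias, where the second keystream byte is 0 with probability roughly 1/128, double what a uniform stream would produce. Biases like this are one reason RC4 has been barred from modern protocols.

```python
import os

def rc4_keystream(key: bytes, n: int) -> list:
    """Generate n keystream bytes from textbook RC4."""
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA)
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return out

# Mantin-Shamir bias: across many random keys, the second keystream
# byte is 0 about twice as often as a uniform stream would allow
# (~1/128 instead of 1/256).
trials = 20000
zeros = sum(1 for _ in range(trials)
            if rc4_keystream(os.urandom(16), 2)[1] == 0)
# A uniform generator would give ~78 zeros here; RC4 gives ~156.
assert zeros > 115  # far above what an unbiased stream produces
```

An attacker observing many ciphertexts under different keys can exploit exactly this kind of bias to recover plaintext bytes, which is why no organization drafting its own policy would mandate RC4 today.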

In this age of competition, organizations have trade secrets to guard, not just from competitors but also from rogue governments. A weakened encryption scheme and mandatory storage of encrypted data in its plain-text form is nothing less than hara-kiri for these organizations. Moreover, by way of an agreement, the draft expects software and hardware vendors to comply with these encryption restrictions, thereby weakening the overall security of India’s IT infrastructure.

A National Encryption Policy should be about setting minimum encryption standards for data protection, penalizing organizations and institutions that fail to implement strong encryption, and protecting data from pilferage and leakage.

Encryption policy has always had a direct impact on the privacy of individuals, and when encryption is used by corporations and organizations, it affects their business and trade secrets; hence the government should also consider ways of implementing and strengthening the country’s non-existent privacy laws.

As we have been promised that the policy would be re-drafted, let us keep our fingers crossed and hope that better sense prevails.

National Encryption Policy: Government Issues Clarification on WhatsApp, Social Media

The government issued an addendum clarifying that “mass use encryption products, which are currently being used in web applications, social media sites, and social media applications such as WhatsApp, Facebook, Twitter etc.” are exempt. While that language is vague in itself, you can rest easy without needing to worry about having to store your WhatsApp messages for 90 days. The original text continues below.

The DeitY has posted a draft National Encryption Policy on its website inviting comments from the public on its mission, strategies, objectives, and regulatory framework, which you can send to akrishnan@deity.gov.in, until 16th October 2015. A lot of the details mentioned in the draft guidelines are worrying, and this is a topic that concerns every consumer.

While the draft encryption policy’s preamble starts by talking about improving e-governance and e-commerce through better security and privacy measures, it very quickly brings up national security as well, and that’s where things get worrying from a consumer’s perspective. It’s very reminiscent of when the Indian government was thinking about banning BBM in India unless BlackBerry (then Research in Motion) gave security agencies access to snoop on emails. The two would eventually reach an arrangement that allowed the government to intercept email.

The language of the new draft policy is quite clear on one thing – businesses and consumers may use encryption for storage and communication, but the encryption algorithms and key sizes will be prescribed by the Indian government. What’s more, vendors of encryption products would have to register in India (with the exception of mass use products, such as SSL), and citizens are allowed to use only the products registered in India.

“Would OpenPGP, a commonly-used standard for encryption of email, fall under ‘mass use’?” asks Pranesh Prakash, Policy Director at the Centre for Internet and Society, speaking to Gadgets 360. “Because if it doesn’t, I am prohibited from using it. But if it does, I am required to copy-paste all my encrypted mails into a separate document to store it in plain text, as required by the draft policy. Is that what it really intends? Has the government thought this through?”

Most people don’t explicitly use encryption, but it’s built into apps they use every day. Do the draft guidelines also extend to products and services with built-in encryption, like WhatsApp? If yes (and the language certainly suggests they do), then combine them with the government’s requirements for its citizens, as proposed in the draft guidelines, and we have some very worrying scenarios.

The draft guidelines read “All citizens (C), including personnel of Government/ Business (G/B) performing non-official/ personal functions, are required to store the plaintexts of the corresponding encrypted information for 90 days from the date of transaction and provide the verifiable Plain Text to Law and Enforcement Agencies as and when required as per the provision of the laws of the country.”

WhatsApp messages are now encrypted end-to-end. So do the draft guidelines mean you have to store a copy of all your WhatsApp messages for 90 days? What about Snapchat, or any other form of ephemeral messaging that is automatically deleted after being read? The consumer is expected to maintain plain-text copies of all communications for 90 days, so that these can be produced if required by the laws of the land. So will it even be legal to read a message that deletes itself, if and when the draft guidelines become law?

The draft policy document states that the vision is to create an information security environment and secure transactions. But the actual details mentioned in the draft appear to do the opposite, focusing instead on limiting encryption to technologies that the government could likely intercept when required.

This is in many ways similar to the Telecom Regulatory Authority of India’s draft letter on Net Neutrality, which instead talked about issues like cyberbullying and ‘sexting’. In the feedback period, Trai received over 1 million emails, but the Department of Telecom’s report on Net Neutrality also went against public sentiment on certain points, suggesting that telcos should be allowed to charge extra for specific services, such as Skype or WhatsApp voice calls in India, showing that calls for feedback aren’t necessarily being taken seriously.

With the draft National Encryption Policy, another problem shared with the Net Neutrality discussions is the use of vague language. The result is that there is very little clarity at this point on what will and will not be permitted by the government if the draft guidelines are adopted. We’re living in a time when the government talks about how WhatsApp and Gmail may be used by “anti-national elements”, and has even considered requiring Twitter and Facebook to establish servers in India.

With that in mind, you have to ask: will it even be legal to use WhatsApp if these guidelines are implemented? After all, WhatsApp messages have end-to-end encryption, and if the service does not register in India and comply with the algorithms prescribed by the government, then as a citizen of India you won’t be allowed to use it, because “users in India are allowed to use only the products registered in India,” as per the draft guidelines.

These are questions that don’t just affect a few people, but just about every Indian who is using the mobile Internet. In its present form, the draft actually severely limits what you can do online, and could hobble the push for a digital India. There’s almost a full month to give our feedback, but is anyone listening?

Experts pick big holes in India’s encryption policy

India’s proposed encryption policy has come under heavy fire with internet experts and online activists alleging that it provides blanket backdoors to law enforcement agencies to access user data, which could be abused by hackers and spies.

The Department of Electronics and Information Technology (DeitY) has asked for public comments on the ‘Draft National Encryption Policy’ on its website until October 16. The stated mission of the policy on encryption, that is, the practice of scrambling data to make it unintelligible even to the service providers, is to “provide confidentiality of information in cyber space for individuals, protection of sensitive or proprietary information for individuals & businesses, (and) ensuring continuing reliability and integrity of nationally critical information systems and networks”.

However, almost all the experts ET spoke to, while agreeing that a policy for encryption is a welcome move, felt that the policy document in its current form is not well thought-out and makes suggestions that could harm businesses and individuals, and thwart research and development in the field of encryption. The most contentious provision in the draft policy document is perhaps the one requiring businesses and individuals to keep a plain text copy of the data they encrypt for storage and communication, for 90 days, and make it available to law enforcement agencies “as and when demanded in line with the provisions of the laws of the country”.

“The mission of the policy is to promote national security and increase confidentiality of information, but it specifically excludes ‘sensitive departments/agencies’, which most need such protection. The content of the policy shows why they have been excluded: the policy, in fact, decreases security and confidentiality of information,” said Pranesh Prakash, policy director at the Centre for Internet and Society. “If our emails, for example, are required to be kept in plain text rather than in encrypted form, then that makes it easier for hackers and foreign agencies to spy on our government, businesses, and on all Indian citizens,” he said.

Raman Jit Chima, policy director at digital rights organisation Access, said that instead of promoting the use of encryption, the policy draft “appears to seek to heavily regulate encryption and the rules it proposes will likely impede its usage by Indian developers and startups”. “By trying to restrict and weaken the everyday usage of encryption in order to facilitate tapping demands, the everyday communications of all Indians will likely become less secure,” Chima said.

The policy seeks to promote R&D in the field of cryptography by public and private companies, government agencies and academia, but it requires all vendors of encryption products to register their products with the government and re-register when their products are upgraded.

Arun Mohan Sukumar, cyber initiative head at Observer Research Foundation, said, “The government has finally realised the need to protect its communications infrastructure from cyber intrusions. But creating a ‘license raj’ of encrypted products and services, as this draft policy aims to, will only stunt cyber security research.”

Obama edges toward full support for encryption

President Obama recently called on the best minds in government, the tech sector and academia to help develop a policy consensus around “strong encryption” — powerful technologies that can thwart hackers and provide a profound new level of cybersecurity, but also put data beyond the reach of court-approved subpoenas.

From Obama on down, government officials stressed that they are not asking the technology sector to build “back doors” that would allow law enforcement and intelligence agencies to obtain communications in the event of criminal or terrorist acts.

That prospect drew an extremely negative reaction from the techies — and is still chilling the government-industry dialogue over the issue.

Instead, the government is saying that tech and communications companies themselves should have some way to unlock encrypted messages if law enforcement shows up with a subpoena.

Access to such messages could, in theory, be vital in real-time crises. Skeptical lawmakers have said federal officials have offered no empirical data suggesting this has been a problem.

“One of the big issues … that we’re focused on, is this encryption issue,” Obama said during a Sept. 16 appearance before the Business Roundtable. “And there is a legitimate tension around this issue.”

Obama explained: “On the one hand, the stronger the encryption, the better we can potentially protect our data. And so there’s an argument that says we want to turbocharge our encryption so that nobody can crack it.”

But it wasn’t as simple as that.

“On the other hand,” Obama said, “if you have encryption that doesn’t have any way to get in there, we are now empowering ISIL, child pornographers, others to essentially be able to operate within a black box in ways that we’ve never experienced before during the telecommunications age. And I’m not talking, by the way, about some of the controversies around [National Security Agency surveillance]; I’m talking about the traditional FBI going to a judge, getting a warrant, showing probable cause, but still can’t get in.”

According to the president, law enforcement, the tech community and others are engaged in “a process … to see if we can square the circle here and reconcile the need for greater and greater encryption and the legitimate needs of national security and law enforcement.”

Obama summed up: “And I won’t say that we’ve cracked the code yet, but we’ve got some of the smartest folks not just in government but also in the private sector working together to try to resolve it. And what’s interesting is even in the private sector, even in the tech community, people are on different sides of this thing.”

However, the tech sector, writ large, has shown little interest in negotiating over strong encryption.

After a recent hearing of the House Intelligence Committee, Rep. Adam Schiff, D-Calif., said technology companies want the government to spell out what it wants, and that techies simply will not craft a policy in an area that should be free from government interference.

Tech companies are deeply concerned that American-made products will be seen in the global marketplace as tainted if they reach some kind of accommodation with the government. It’s all part of the continued international blowback from the revelations by ex-NSA contractor Edward Snowden, tech groups say.

Schiff visited with several Silicon Valley-based companies over the recent summer recess. “I was impressed by the companies’ position — it’s hard to refute. But what was unusual, more than one of the companies said government should provide its [proposed] answer in order to advance the discussion,” he said.

The tech sector, Schiff said, is unlikely to advance a policy position other than its opposition to any mandated “back door.”

“But there has to be some kind of resolution, even if it is acceptance of the status quo.”

Schiff and other lawmakers, including Senate Judiciary Chairman Charles Grassley, R-Iowa, are trying to encourage a dialogue between the tech sector and law enforcement.

FBI Director James Comey testified before the House Intelligence panel that such talks are underway, and have been productive so far.

“First of all, I very much appreciate the feedback from the companies,” Comey said at the Sept. 10 Intelligence Committee hearing. “We’ve been trying to engage in dialogue with companies, because this is not a problem that’s going to be solved by the government alone; it’s going to require industry, academia, associations of all kinds and the government.”

He stressed: “I hope we can start from a place we all agree there’s a problem and that we share the same values around that problem. … We all care about safety and security on the Internet, right? I’m a big fan of strong encryption. We all care about public safety.”

It was an extremely complicated policy problem, Comey agreed, but added, “I don’t think we’ve really tried. I also don’t think there’s an ‘it’ to the solution. I would imagine there might be many, many solutions depending upon whether you’re an enormous company in this business, or a tiny company in that business. I just think we haven’t given it the shot it deserves, which is why I welcome the dialogue. And we’re having some very healthy discussions.”

Tech sources contacted after the hearing suggested that Comey was overstating the level of dialogue now taking place.

The Obama administration has signaled that it isn’t looking for a legislative solution, which is just as well, because lawmakers including Schiff and Grassley have said that is a highly unlikely prospect.

But the administration probably needs to give a clearer signal of what it would like to see at the end of this dialogue before the tech side agrees to fully engage.