American ISIS Recruits Down, but Encryption Is Helping Terrorists’ Online Efforts, Says FBI Director

The number of Americans traveling to the Middle East to fight alongside Islamic State has dropped, but the terrorist group’s efforts to radicalize people online are getting a major boost from encryption technology, FBI Director James Comey said Wednesday.

Since August, just one American a month has traveled or attempted to travel to the Middle East to join the group, compared with around six to 10 a month in the preceding year and a half, Mr. Comey told reporters in a round-table meeting at FBI headquarters.

However, federal authorities have their hands full trying to counter Islamic State’s social media appeal. Of around 1,000 open FBI investigations into people who may have been radicalized across the U.S., about 80% are related to Islamic State, Mr. Comey said.

The increasing use of encrypted communications is complicating law enforcement’s efforts to protect national security, said Mr. Comey, calling the technology a “huge feature of terrorist tradecraft.”

The FBI director cited Facebook Inc.’s WhatsApp texting service, which last month launched end-to-end encryption in which only the sender and receiver are able to read the contents of messages.

“WhatsApp has over a billion customers—overwhelmingly good people but in that billion customers are terrorists and criminals,” Mr. Comey said. He predicted an inevitable “collision” between law enforcement and technology companies offering such services.

Silicon Valley leaders argue that stronger encryption is necessary to protect consumers from a variety of threats.

“While we recognize the important work of law enforcement in keeping people safe, efforts to weaken encryption risk exposing people’s information to abuse from cybercriminals, hackers and rogue states,” WhatsApp CEO Jan Koum wrote last month in a blog post accompanying the rollout of the stronger encryption technology. The company Wednesday declined to comment on Mr. Comey’s remarks.

The FBI also continues to face major challenges in unlocking phones used by criminals including terrorists, Mr. Comey said. Investigators have been unable to unlock around 500 of the 4,000 or so devices the FBI has examined in the first six months of this fiscal year, which began Oct. 1, he said.

“I expect that number just to grow as the prevalence of the technology grows with newer models,” Mr. Comey added.

A terrorist’s locked iPhone recently sparked a high-stakes legal battle between the Justice Department and Apple Inc.
After Syed Rizwan Farook and his wife killed 14 people and wounded 22 in a December shooting rampage in San Bernardino, Calif., FBI agents couldn’t unlock the phone of Mr. Farook—who, along with his wife, was killed later that day in a shootout with police.

The government tried to force Apple to write software to open the device, but the technology company resisted, saying that such an action could compromise the security of millions of other phones.

That court case came to an abrupt end in March, when the FBI said it no longer needed Apple’s help because an unidentified third party had shown it a way to bypass the phone’s security features.

Users’ interest should drive encryption policy: IAMAI

Encryption is a fundamental and necessary tool to safeguard digital communication infrastructure, but the interests of Internet users should be foremost in framing any policy, the Internet and Mobile Association of India (IAMAI) said here on Tuesday.

“Trust, convenience and confidence of users are the keywords to designing an ideal encryption policy that will help in getting more people online with safe and secured internet platforms,” said IAMAI president Subho Ray.

The association, which has published a discussion paper on encryption policy, suggests that a broad-based public consultation with all stakeholders, including user groups, should precede the making of an encryption policy.

According to the paper, the foundation of a user-centric encryption policy consists of freedom of encryption, a strong base standard for encryption, no plaintext storage, and mandatory legal monitoring with no backdoor entry.

An essential element of the paper’s suggestion is that support for strong encryption is critical to countering cybersecurity issues around the globe; it also pitches the importance of freedom of encryption for users, organisations and business entities.

Encryption: Friend of Freedom, Guardian of Privacy

The issue of government access to private encrypted data has been in the public eye since the San Bernardino shootings in December 2015. When an iPhone was found, the FBI requested that Apple write code to override the phone’s security features. The FBI was ultimately able to decrypt the phone without Apple’s assistance. However, the ensuing debate over encryption has just begun.

High-profile criminal and national security cases shed light on an issue that is pervasive throughout the country. Local governments presumably have thousands of devices they would like to decrypt for investigatory purposes; New York City alone has hundreds. Seeking a resolution, and remembering the horrific terror attacks of September 11, 2001, New York State Assembly Bill A8093A, now in committee, seeks to outlaw the sale of phones in the state whose encryption cannot be bypassed by law enforcement.

Encryption allows for the safekeeping and targeted dissemination of private thoughts and information without worry of judgment, retaliation or mistreatment. On a grander scale, encryption prevents unchecked government oversight. It can be argued that encryption technology is a hedge against current and future totalitarian regimes. With their histories of occupation and abuse of power, it is no surprise that Germany and France are not pushing for encryption backdoors.

Backdoors in encrypted devices and software provide another avenue for unwelcome parties to gain access. Hackers are often intelligent and well-funded, and they act on their own, in groups and, most harmfully, with foreign entities. Holes have a way of being found, and master keys have a way of being lost.

Senators Richard Burr and Dianne Feinstein are undoubtedly well-intentioned with their draft law, the Compliance with Court Orders Act of 2016. The act calls for providers of communication services, including software publishers, to decrypt data when served with a court order. The data would have to be provided in an intelligible format, or technical assistance provided for its retrieval. Prosecutors have a need to gather evidence. Governments have a duty to prevent crime and acts of terror.

However, experts question the feasibility of building backdoors into all types of encryption as it comes in many forms and from a host of global providers. Further, there is concern that the measure, if adopted, will backfire as the targeting of backdoors by our adversaries is assured. Cyberwar in the form of illicit data collection, theft of trade secrets and access to infrastructure is all too common and may escalate as tensions rise between adversaries. Ransomware and cyber extortion have been spreading, most recently at hospitals, and the knowledge of the existence of backdoors will motivate those who seek unseemly profits.

Efforts to prosecute the accused and fight crime and terror are noble causes. However, government should be wise in its approach lest we weaken our shared defenses in the process. The big corporate names of Silicon Valley recognize the dangers of backdoors and are speaking out and lobbying against Senators Burr and Feinstein’s efforts. The draft legislation does ensure that the monetary cost of decrypting is paid to the “covered entity.” However, the costs to society at large remain up for discussion.

The encryption challenge

IT managers know the movies get it wrong. A teenager with a laptop cannot crack multiple layers of encryption — unless that laptop is connected to a supercomputer somewhere and the teenager can afford to wait a few billion years.
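
The “few billion years” is not hyperbole; the arithmetic behind it is easy to sketch. Here is a minimal illustration, where the attacker’s guess rate is an assumption chosen very generously in the attacker’s favor:

```python
# Rough illustration of why brute-forcing modern encryption is infeasible.
# Assumes an attacker testing one trillion (1e12) keys per second, which is
# far beyond any laptop and generous even for a supercomputer.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
GUESSES_PER_SECOND = 1e12

def years_to_exhaust(key_bits: int) -> float:
    """Worst-case years to try every possible key of the given length."""
    keyspace = 2 ** key_bits            # number of possible keys
    return keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR

# A 128-bit key: on the order of 1e19 years, vastly longer than the
# age of the universe (~1.4e10 years).
print(f"{years_to_exhaust(128):.2e} years")
```

Each additional key bit doubles the work, which is why moving from 128-bit to 256-bit keys adds an astronomical margin rather than a linear one.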

Encryption works. It works so well that even the government gets stymied, as demonstrated by the lengths to which the FBI went to access an iPhone used by one of the San Bernardino, Calif., shooters.

So in the face of ever more damaging stories about data breaches, why aren’t all government agencies encrypting everything, everywhere, all the time?

Encryption can be costly and time consuming. It can also be sabotaged by users and difficult to integrate with legacy applications.

Furthermore, according to a recent 451 Research survey of senior security executives, government agencies seem to be fighting the previous war. Instead of protecting data from hackers who’ve already gotten in, they’re still focusing on keeping the bad guys out of their systems.

Among U.S. government respondents, the top category for increased spending in the next 12 months was network defenses — at 53 percent. By comparison, spending for data-at-rest defenses such as encryption ranked dead last, with just 37 percent planning to increase their spending.

Part of the reason for those figures is that government agencies overestimate the benefits of perimeter defenses. Sixty percent said network defenses were “very” effective, a higher percentage than any other category, while government respondents ranked data-at-rest defenses as less effective than respondents in any other category.

There was a time when that attitude made sense. “Organizations used to say that they wouldn’t encrypt data in their data centers because they’re behind solid walls and require a [password] to get in,” said Steve Pate, chief architect at security firm HyTrust.

That attitude, however, runs counter to the modern reality that there is no longer a perimeter to protect. Every organization uses third-party service providers, offers mobile access or connects to the web — or a combination of all three.

A security audit at the Office of Personnel Management, for example, showed that use of multifactor authentication, such as the government’s own personal identity verification card readers, was not required for remote access to OPM applications. That made it easy for an attacker with a stolen login and password to bypass all perimeter defenses and directly log into the OPM systems.

An over-reliance on perimeter defenses also means that government agencies pay less attention to where their important data is stored than they should.

According to the 451 Research survey, government respondents were among those with the lowest confidence in the security of their sensitive data’s location. Although 50 percent of financial-sector respondents expressed confidence, only 37 percent of government respondents could say the same.

In fact, only 16 percent of all respondents cited “lack of perceived need” as a barrier to adopting data security, but 31 percent of government respondents — almost twice as many — did so.

Earlier this year, the Ponemon Institute released a report showing that 33 percent of government agencies use encryption extensively, compared to 41 percent of companies in general and far behind the financial sector at 56 percent. In that survey of more than 5,000 technology experts, 16 percent of agency respondents said they had no encryption strategy.

On a positive note, the public sector has been making headway. Last year, for example, only 25 percent of government respondents to the Ponemon survey said they were using encryption extensively.

“This is showing heightened interest in data protection,” said Peter Galvin, vice president of strategy at Thales e-Security, which sponsored the Ponemon report. High-profile data breaches have drawn public attention to the issue, he added.

ON ENCRYPTION: THERE ARE NO LOCKS ONLY “ANGELS” CAN OPEN

Despite the FBI dropping its case against Apple over whether the tech giant should supply the government agency with the ability to hack into the San Bernardino shooter’s iPhone, the argument over how our devices — especially our phones — should be encrypted continues to rage. And regardless of how you feel about the issue, almost everybody agrees that the debate can be pretty murky, as privacy vs. protection debates usually are. To make the whole argument a lot easier to digest, however, we have one of the web’s best educators and entertainers, CGP Grey, who has broken it all down in one clear five-minute video.

The video, posted above, parallels physical locks with digital “locks” (encryption), noting how they relate and how they differ in order to help us better understand the encryption debate. And one of the most important points that Grey makes about digital locks is that they need to work not only against local threats, but threats from across the globe — threats coming from “internet burglars” and their “burglar bots.”

Grey touches on the scenario in which a bad guy with an armed bomb dies, leaving behind only an encrypted phone with the code to stop the bomb. In this particular case — a parallel to the San Bernardino shooter case, as there may have been information regarding further threats on his encrypted phone — Grey points out that this may be a time when we’d want the police to have access, or a “backdoor,” to the phone. But if companies were forced to build backdoors into their products so government agencies could use them for situations like these, could we trust authorities not to abuse their powers? Could we trust that “demons” (people with bad intentions) wouldn’t hijack the backdoors?

Grey argues that we couldn’t, saying that “there’s no way to build a digital lock that only angels can open and demons cannot.”

There’s also a bonus “footnote” video (below) in which Grey discusses just how intimate the data on our phones has become (do you remember where you were on April 8th at 6:02AM? No? Your phone does).

What do you think about CGP Grey’s breakdown of the encryption debate?

Moot point: Judge closes iPhone encryption case in Brooklyn

The United States Justice Department said Friday that it has withdrawn a request compelling Apple Inc to cooperate in unlocking an iPhone related to a drug case in New York, after a third party provided authorities with a passcode to access the handset.

“An individual provided the department with the passcode to the locked phone at issue in the Eastern District of New York”, Justice Department spokesman Marc Raimondi said in a statement.

On Friday, the Justice Department told a federal court in Brooklyn that it would withdraw the motion to force Apple to pull data from a drug dealer’s locked iPhone, The Washington Post reported.

Investigators have dropped the court case against Apple as they have successfully gained access to the iPhone 5s involved in the NY drug case.

There are about a dozen other unresolved All Writs Act orders for Apple’s assistance with opening other devices, though none are in active litigation, according to a Justice Department official. Apple, meanwhile, demanded to know in the New York case whether the government had exhausted all other options to get to the data.

The company said it “strongly supports, and will continue to support, the efforts of law enforcement in pursuing criminals”, but not through the government’s misuse of a law it wants to use as a “precedent to lodge future, more onerous requests for Apple’s assistance”.

The case dates back to 2014, when authorities seized the iPhone 5s of the suspect Jun Feng. Feng pleaded guilty in October to conspiring to distribute methamphetamine and is scheduled to be sentenced in June. Comments attributed to Apple’s attorneys also suggest that while the company isn’t aware of the method used, it’s convinced that normal product development is eventually going to plug whatever exploit was used to gain access to that iPhone.

According to the Wall Street Journal, that “individual” is Feng himself, who has already been convicted and only recently became aware that his phone was the subject of a national controversy.

The case began on February 16 with an order from Judge Sheri Pym and ended on March 28 when the Justice Department withdrew its legal action against Apple.

Comey’s remarks strongly implied that the bureau paid at least $1.3 million to get into the phone, which had belonged to Syed Rizwan Farook, who, with his wife, killed 14 people in the December 2 terror attack in San Bernardino, Calif.

FBI: Encryption increasing problem

The FBI is facing an increasing struggle to access readable information and evidence from digital devices because of default encryption, a senior FBI official told members of Congress at a hearing on digital encryption Tuesday.

Amy Hess said officials encountered passwords in 30 percent of the phones the FBI seized during investigations in the last six months, and investigators have had “no capability” to access information in about 13 percent of the cases.

“We have seen those numbers continue to increase, and clearly that presents us with a challenge,” said Hess, the executive assistant director of the FBI branch that oversees the development of surveillance technologies.

In her testimony to a subcommittee of the House Energy and Commerce Committee, Hess defended the Justice Department’s use of a still-unidentified third party to break into the locked iPhone used by one of the two San Bernardino, California, attackers. But she said the reliance on an outside entity represented just “one potential solution” and that there’s no “one-size-fits-all” approach for recovering evidence.

Representatives from local law enforcement agencies echoed Hess’s concerns. Thomas Galati, chief of the intelligence bureau at the New York Police Department, said officials there have been unable to break open 67 Apple devices for use in 44 different investigations of violent crime — including 10 homicide cases.

Still, despite anxieties over “going dark,” a February report from the Berkman Center for Internet and Society at Harvard University said the situation was not as dire as law enforcement had described and that investigators were not “headed to a future in which our ability to effectively surveil criminals and bad actors is impossible.”

The hearing comes amid an ongoing dispute between law enforcement and Silicon Valley about how to balance consumer privacy against the need for police and federal agents to recover communications and eavesdrop on suspected terrorists and criminals. The Senate is considering a bill that would effectively prohibit unbreakable encryption and require companies to help the government access data on a computer or mobile device when a warrant is issued. Bruce Sewell, Apple’s general counsel, touted the importance of encryption.

“The best way we, and the technology industry, know how to protect your information is through the use of strong encryption. Strong encryption is a good thing, a necessary thing. And the government agrees,” Sewell testified.

How Apple makes encryption easy and invisible

Do you know how many times a day you unlock your iPhone? Every time you do, you’re participating in Apple’s user-friendly encryption scheme.

Friday, the company hosted a security “deep dive” at which it shared some interesting numbers about its security measures and philosophy as well as user habits. To be honest, we’re less concerned with how Apple’s standards work than the fact that they do and will continue to. But that’s kind of the point behind the whole system — Apple designed its encryption system so that we don’t even have to think about it.

Apple’s encryption and security protocols have faced a ton of scrutiny during its recent showdown with the government. And if anything, that debate has gotten more people thinking seriously about how data can and should be secured. And the topic is not going away for a while.

We weren’t there Friday, but Ben Bajarin from Techpinions offers some great analysis, and his piece includes some really cool stats. For one, Apple says that the average user unlocks their phone 80 times a day. We don’t know if that’s across all platforms or just iOS. It sounds a little low in my case, however, because I’m generally pretty fidgety.

But because people are checking their phones so often, it’s important for Apple developers to make encryption powerful without causing the end user frustration. Like if they could just plunk their thumb down, and their phone would unlock, for example.

Apple says 89 percent of people who own Touch ID-enabled devices use the feature. That’s a really impressive adoption rate, but it makes sense when you think about how much easier the biometric system is to use than a passcode.

Passcodes are great, of course, and you have to have one. But as an experiment a while ago, I turned off Touch ID and went numbers-only to unlock my phone. And guess what? It was really annoying. I switched the feature back on by the end of the day.

Apple also talked up its so-called Secure Enclave, its slightly intimidating name for the dedicated co-processor that has handled all encryption for its devices since the iPhone 5s. Each Enclave has its own unique ID that it uses to scramble up all of the other data for safekeeping. And neither Apple nor other parts of your phone know what that UID is; it all just happens on its own. And that’s pretty much how we prefer it.
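
The design described above (a device-unique secret that is never exported, used to derive the keys protecting everything else) can be sketched in miniature with standard key-derivation primitives. This is an illustrative toy, not Apple’s actual scheme; the UID, salt, and iteration count here are invented for the example:

```python
import hashlib
import os

# Toy sketch of UID-based key derivation: a device-unique secret that never
# leaves the hardware is entangled with the user's passcode to derive the
# encryption key, so the data cannot be decrypted off-device.
DEVICE_UID = os.urandom(32)   # stands in for the fused, hardware-only UID

def derive_key(passcode: str, salt: bytes) -> bytes:
    # PBKDF2 mixes the passcode with the device UID; the many slow
    # iterations also rate-limit brute-force guessing of short passcodes.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode() + DEVICE_UID, salt, 100_000
    )

salt = os.urandom(16)
key = derive_key("123456", salt)
assert derive_key("123456", salt) == key   # same passcode -> same key
assert derive_key("654321", salt) != key   # wrong passcode -> wrong key
```

Because the derived key depends on a secret only this device holds, even a correct passcode guess made on different hardware produces the wrong key.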

Apple, FBI set to resume encryption fight at House hearing

The encryption battle between Apple and the FBI is moving from the courtroom to Congress next week.

Representatives from the tech titan and the federal law enforcement agency are scheduled to testify Tuesday before the House Energy and Commerce Committee about the debate over how the use of encryption in tech products and services hampers law enforcement activities.

In February, Apple clashed with the FBI over whether the company would help investigators hack into the encrypted iPhone of San Bernardino shooter Syed Farook. That case ended when the FBI said it had found a way to unlock the phone without Apple’s help. The debate, however, is unresolved.

Technology companies and rights groups argue that strong encryption, which scrambles data so it can be read only by the right person, is needed to keep people safe and protect privacy. Law enforcement argues it can’t fight crimes unless it has access to information on mobile devices.
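
What “scrambles data so it can be read only by the right person” means can be made concrete with a toy symmetric cipher: anyone holding the shared secret key can reverse the scrambling, and nobody else can. This XOR one-time-pad sketch is for illustration only, not something to deploy:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time-pad style XOR: the identical operation both encrypts
    # and decrypts, because (d ^ k) ^ k == d.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = os.urandom(len(message))          # the shared secret
ciphertext = xor_cipher(message, key)   # unreadable without the key
assert xor_cipher(ciphertext, key) == message  # key holders recover it
```

Real systems use vetted ciphers such as AES rather than a raw XOR pad, but the core property is the same: without the key, the ciphertext is just noise.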

The hearing, called “Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives,” will include two panels. The first features Amy Hess, executive assistant director for science and technology at the FBI, who will speak about law enforcement concerns along with other law enforcement officials from around the country. Apple general counsel Bruce Sewell will speak during a second panel, which will feature computer science and security professionals.

The FBI and Apple did not immediately respond to requests for comment on their testimony.

The hearing’s agenda comes just a day after a US Senate encryption bill was released that would give law enforcement and government investigators access to encrypted devices and communications. Authored by US Sens. Dianne Feinstein and Richard Burr, the bill furthers a fight that pits national security against cybersecurity.

Earlier this month, Facebook complicated things a bit further for the FBI when it announced that all communications sent on its popular WhatsApp messaging app are now encrypted.

Feinstein encryption bill sets off alarm bells

A draft version of a long-awaited encryption bill from Sens. Dianne Feinstein, D-Calif., and Richard Burr, R-N.C., was leaked online last week, and the technology industry is already calling foul.

The bill requires any company that receives a court order for information or data to “provide such information or data to such government in an intelligible format” or to “provide such technical assistance as is necessary to obtain such information or data in an intelligible format.” It doesn’t specify the terms under which a company would be forced to help, or what the parameters of “intelligible” are.

The lack of these boundaries is one of the reasons why the backlash to the bill — which isn’t even finished — has been so fast and overwhelming. Kevin Bankston, director of the Open Technology Institute, called it “easily the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen.”

It’s disheartening that the senators intend to continue pressing on with this bill, especially in light of the FBI’s recent bullying of Apple. After the FBI bungled its handling of the San Bernardino shooter’s phone, it tried and failed to force Apple into creating a new program that would let it hack into not just the shooter’s phone but probably many other phones as well. When Apple resisted, the FBI mysteriously came up with a workaround. Small wonder other technology companies are reacting poorly to this Senate bill.

Feinstein’s staffers said that the issue is larger than one phone. That’s true — and it’s exactly why such a broad proposal should make everyone who uses a smartphone uneasy. Giving law enforcement such a broad mandate would inevitably lead to questionable decisions, and it would weaken Internet security for everyone.

Feinstein’s staff also said that the reason for the bill’s vagueness is that its goal is simply to clarify the law, not to set a strict method for companies or to tell courts what the penalties should be if companies choose not to follow orders. That sounds good in theory. In practice, Feinstein and Burr would be well-advised to go back to the table with technology interests — and really listen to their concerns.