Thursday, February 18, 2016

Apple vs. the FBI: This may not be a war Apple can win

While Apple has come out strongly in favor of security and privacy, it's facing an uphill battle with the FBI and other government security organizations. David Gewirtz looks at the reasons why Apple may be right, but also why its attempt to block the FBI's order may prove futile.


This... will not end well.

What I'm talking about, of course, is the recent Tim Cook open letter to Apple customers stating that Apple will not comply with the FBI's demand to provide customized access technology to break into an iPhone used by the San Bernardino killers.


As soon as Cook published his statement, nearly all news organizations, bloggers, and even activists jumped onto the issue. Because so much has already been written, I'm not going to repeat the details of the FBI demand, discuss the technical merits, or look at the legal basis. Click the links in that last sentence to get a quick overview.

Instead, I'm going to look at two factors: the security dangers such an action might present, and whether or not Apple stands a chance of coming out of this without being ground down into apple sauce.

THE SECURITY DANGERS

For general background, I've written extensively about the risks of providing back doors and how back doors can compromise national security:

Smartphone encryption ban? It's a boon for criminals and terrorists

Encryption is not the enemy: A 21st century response to terror

In this specific case, however, we're talking about breaking into an iPhone 5c. The 5c, you'll recall, does not support Touch ID and, therefore, lacks the hardware Secure Enclave capabilities of more modern phones. So, while it's a truism that any security can be cracked eventually, breaking through hardware-backed encryption is substantially harder -- usually too difficult to accomplish in any useful time frame.

But, because the 5c is less secure, it is reasonable to assume that custom penetration software, in the form of a unique iOS image, could be written and used to help the FBI extract information from the phone.
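To put rough numbers on why such an image would help, here's a back-of-the-envelope sketch in Python. It assumes the widely reported substance of the order (disable the erase-after-ten-failures option and the escalating retry delays so passcodes can be guessed rapidly) and the roughly 80-millisecond-per-guess key derivation Apple has described; the figures are assumptions, and the point is the order of magnitude, not the exact math.

# Back-of-the-envelope brute-force math; assumed figures, not Apple's code.
# Assumes ~80 ms of key-derivation work per passcode guess, with the retry
# delays and the erase-after-ten-failures option disabled.

ATTEMPT_COST_SECONDS = 0.08  # assumed per-guess cost

def worst_case_hours(passcode_digits):
    """Hours needed to try every numeric passcode of the given length."""
    combinations = 10 ** passcode_digits
    return combinations * ATTEMPT_COST_SECONDS / 3600.0

for digits in (4, 6):
    print("%d-digit passcode: about %.1f hours worst case"
          % (digits, worst_case_hours(digits)))

# Roughly 0.2 hours (about 13 minutes) for a 4-digit passcode,
# and about 22 hours for a 6-digit one.

In other words, once the software speed bumps are removed, a short numeric passcode on a 5c is not much of a barrier at all.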

Let's be clear that in this instance, the FBI is not asking Apple to put a back door into iOS. 

They are asking Apple to create a special version of iOS that could be loaded onto this specific phone to help break into it.

This is a subtle but important difference. Your copy of iOS would not contain the weakened security -- just the copy loaded on this one phone. Or at least that's the FBI's premise.

It is a fair premise, except for the fact that it's also a naive one. The FBI says a single copy of the code would be used on this one phone, even to the point of ID-limiting the code to that specific device.
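For illustration only, here's a hypothetical sketch of what that ID-limiting might look like; the names and values are invented, and this is emphatically not Apple's code, but it shows that the restriction would ultimately be an ordinary software check inside a signed binary.

# Hypothetical sketch of "ID-limiting" a weakened build to one handset.
# Names and values are invented for illustration; this is not Apple's code.

TARGET_DEVICE_ID = "EXAMPLE-DEVICE-ID"  # the one phone the order covers

def keep_retry_limits(device_id):
    """Leave the normal passcode retry limits in place unless this is the target phone."""
    return device_id != TARGET_DEVICE_ID

The gate is just data plus a comparison inside a signed binary -- exactly the kind of thing that could be patched out or re-targeted if that binary ever escaped.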

The naivety lies in thinking that either the government or Apple could absolutely, positively, without any doubt, ensure that this code didn't get into the wild. This would mean that Apple could have absolutely no security breaches. But we've seen that Apple makes mistakes.

This would mean that the government could have no security breaches. But as we've seen from OPM, other breaches, and even older examples, the government's security is porous. It would also mean that every FBI employee with a security clearance could be trusted implicitly, but we've seen terrible examples of where trusted government employees and contractors have stolen critical information.

We've seen government employees sell drugs from the office. We've even seen members of Hillary Clinton's State Department use the passport database to spy on celebrities, actors, comics, musicians, politicians, athletes, models, members of the media, family members, and friends.

My point here is simple. Neither Apple nor the government can be sure that a tool designed to break into iPhone security will never fall into the wrong hands.

Now, let's be clear. There are already hacks available for the iPhone. There have always been mechanisms to jailbreak iPhones. While, in this particular case, a custom Apple iOS hacking tool might make things easier for the FBI, broken forms of iOS already do exist in the wild.

But a hacked version of iOS is different from something specially built and compiled from source. It's important not to underestimate the risk of what happens when something dangerous -- like a fully open and security-nerfed version of iOS -- gets into the wild.

Take Stuxnet, for example. Stuxnet, if you recall, was government-scale malware designed to sabotage uranium-enrichment centrifuges in Iran. Back in 2012, I led a team that included senior White House officials in a nationwide cyberattack simulation based on the question of what might happen if Stuxnet got released into the wild. The problem is that once such a weapon is out there, the bad guys can reverse engineer it and launch Stuxnet-like attacks back at us.

Apple is basically saying that if they release a compromised iOS, it will travel, it will not be secured by the FBI (a fair concern given OPM), and it will put Apple's promise of security at risk. These are fair and reasonable statements.

In short, if Apple enables even a single-use back door, the existence of such a thing will undoubtedly be used against us. Even a special-case, one-off crack like the one the FBI has requested could break loose and compromise security.

INTERNATIONAL IMPLICATIONS

There are international implications that Cook's letter does not address. By the way, if you want to be able to read this document in the future, grab yourself a copy of it. Given the URL of apple.com/customer-letter/, it's moderately likely that URL will be used for some other communication in the future, overwriting this important document.

But, back to the international implications. Cook talks about the FBI's interpretation of a 227-year-old law called the All Writs Act of 1789. This is a very broad statute which, essentially, lets courts issue whatever orders they deem necessary when no existing law covers the situation. Disclaimer: that's an over-simplification. It will be discussed at length across the Web. Google it.

The thing is, that's a U.S. statute. Other governments, especially more authoritarian regimes, just demand what they want and expect to get it. Does anyone seriously expect, if the FBI has an iPhone hacking tool, that the Russians or the Chinese won't demand access to the same technology?

This is where "we have it but we won't give it to you" differs from "we've never built it, never want to, and never will." I have no doubt that other countries, especially ones that offer huge customer bases and sales growth to Apple (can you spell China?), would demand that Apple provide similar break-in tech, possibly even as the price of entry into their markets.

Microsoft, for example, has already turned over source code for desktop Windows and Windows Server to Russia's Federal'naya sluzhba bezopasnosti Rossiyskoy Federatsii. The FSB is present-day Russia's successor to the infamous Soviet-era KGB.

China has already set up regulations requiring foreign technology companies to provide source code to their systems. China is also insisting that U.S. companies provide decryption keys or back doors to products they want to sell in that nation.

Is it a big leap to think that once an encryption-free iOS version exists, China and Russia will demand it for their own use? Is it even a big leap to think that once it exists, China and Russia will insist it be the primary version of iOS distributed to customers in their countries?

To be sure, Apple is also drawing a line in the sand, basically using San Bernardino as the test case for whether it can be compelled to engineer against its own best interests. I'm guessing Apple would rather fight this battle now, when it's a defined battle with defined parameters, than have to fight it later in the heat of urgency.

WHAT IF APPLE COMPLIES?

Already, activists are getting together to protest the possibility of Apple providing the FBI with access to the iPhone. While there is no doubt Apple can weather the PR storm, protests outside its stores are clearly not something the company wants.

That said, there's no doubt the company gamed out many of the possible scenarios before Tim Cook published his manifesto. They clearly think the no-we-won't approach aligns with their goals and values.

My wife asked me a question that relates to this. She said, "Well, what if Apple tells the FBI to bring in the phone, and they'll look at it in Cupertino? What's wrong with letting the FBI bring in phones when they need to, but not releasing any code?"

I talked above about the risk posed by the code's very existence, but the idea of helping the FBI track down possible co-conspirators in the heinous San Bernardino attack is something most Americans should support. After all, these were bad people, and we don't want to let something like that happen again.

But... for all the reasons discussed above, if Apple were to comply this once, it would set a dangerous and far-reaching precedent for future cases that may not be as clear cut.

One of Apple's selling points is that its iOS devices are more secure than Android devices. Apple has worked hard and braved a lot of criticism to provide this level of security. Customers have come to expect enhanced security, and to accept the trade-offs inherent in their choosing Apple products because of it.

Were Apple to sacrifice that security and privacy on a larger scale, it would risk losing credibility and possibly market share. And were security-nerfed devices to be cracked on a regular basis, lawsuits would undoubtedly abound.

And here's one more point to consider: with Apple carrying such a high market valuation, a substantial loss by the company could drop its stock to the point that the entire economy suffers as a result.

THE DANGERS IF IT DOESN'T

Here is Apple's biggest risk. If Apple refuses to help the FBI and, as a result, another terrible attack that might otherwise have been prevented is not prevented, Apple will have blood on its hands.

Heretofore, the worst Apple has done is to publish driving directions that strand drivers in inconvenient locations. But if Apple can be implicated in holding back information that could have prevented another San Bernardino or -- far worse -- another 9/11, the company might not live down the righteous anger that circumstance might foster.

WHY APPLE MAY NOT WIN

This may not be a war Apple can win. Both approaches are rife with potential disasters. On one hand, complying sacrifices customer privacy. On the other, refusing may help terrorists get away with murder.

But let's bring this back from a philosophical discussion to a practical one. Apple, as big and rich and successful as it is, is unlikely to prevail against superpower world governments.

Take America, for example. Let's say Apple refuses to comply with the FBI directive. What could happen? There's probably a long legal path for this, but nearly all Apple products have to pass through the ports. The government controls the ports.

If Apple doesn't comply with the FBI directive, a worst-case scenario is that all those iPhones would be blocked from entering the country.

Or what if people die, and those deaths can be attributed to Apple's inaction? While I certainly can't quote you case law, could Tim Cook and other Apple executives be prosecuted under various anti-terrorism statutes? It's possible. So, could an Apple executive wind up behind bars? How many people would have to die before the mob demands it?

What about China? China's government is already struggling with unrest. That government has a potential death grip on Apple. Not only is Apple counting on the huge potential customer base in China for revenue growth, but nearly all of Apple's primary manufacturing facilities are in the country.

Does anyone think China won't make demands against Apple? China will demand what it wants, and Apple will either be forced to comply, or suffer vast supply chain and distribution channel damage.

The point of all this is that while Apple may be fighting the good fight (and that's actually hard to determine, given that both sides have merit), the force of determined governments can reach through Apple's reality distortion field and impose harsh reality instead.

While it's possible that through the force of customer loyalty, good public relations, and an extensive lobbying effort Apple may prevail, I wouldn't take odds on that bet.

FREEDOM VS. SECURITY

As a nation, we have to make a huge trade-off decision -- and most government officials are making the wrong one. Do we compromise security now for the expediency of easy investigation? Or do we preserve everyone's security at the possible risk that someone who has been secured is a bad guy?

This is not a new discussion. The inherent challenge of freedom is that it is often tempting to sacrifice some of it in exchange for easy security. As a nation, we've always been willing to accept the risk that freedom for everyone opens the door to bad guys, because that essential freedom makes us strong enough to handle the threats.

That said, that's only the value system of some nations, like the United States. Apple is competing globally, and most civilized nations are facing very uncivilized threats. It's unlikely the freedom argument will win in the long term, when thousands of lives across hundreds of nations are at stake.


