Why You Should Care About the Government's Demand for an iPhone Backdoor — And Apple's Refusal to Comply

It didn't happen to you...yet.
Publish date:
February 18, 2016
apple, security theatre, security, terrorism

Apple has found itself at the center of a key privacy battle this week over a court order mandating that it engineer a backdoor to an iPhone used by Syed Rizwan Farook, one of the San Bernardino shooters. The FBI claims that it needs Apple's help to unlock the phone, bypassing the auto-erase feature that protects Farook's data. Apple says that complying with the FBI's demands would set a dangerous precedent, and that precedent is an issue that everyone should be deeply concerned about.

In simple terms, the FBI is facing a problem with Farook's phone: the software used to lock it has an auto-erase feature that scrubs the data on the phone after too many failed login attempts. It's a security measure designed to ensure that private data stays out of the wrong hands, and it's one the federal government vigorously opposed when both Google and Apple announced that they were equipping their next-generation phones with more robust privacy features. Agencies like the FBI and NSA claimed that they needed backdoor access for situations exactly like this one: without it, cracking the phone is functionally impossible, leaving information they say could be valuable to the investigation out of reach.
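The mechanism at issue can be sketched in a few lines. This is a conceptual illustration only, loosely modeled on iOS's documented ten-attempt limit; Apple's actual implementation is enforced by the Secure Enclave hardware and is not public, so the names and structure here are assumptions.

```python
MAX_ATTEMPTS = 10  # iOS's documented erase-after-10-failures setting

class PasscodeGuard:
    """Toy model of an auto-erase lockout, not Apple's real code."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.erased = False

    def try_unlock(self, guess: str) -> bool:
        if self.erased:
            return False          # data already wiped; nothing to unlock
        if guess == self._passcode:
            self._failed = 0      # successful unlock resets the counter
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            # Discarding the encryption keys renders the data unrecoverable,
            # which is what "erase" means here.
            self.erased = True
        return False
```

The point of the design is visible in the toy: after ten wrong guesses the device wipes itself, so even the correct passcode no longer recovers anything. That is precisely the barrier the FBI wants removed.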

Apple and privacy advocates object to demands to install a backdoor for a very obvious reason: once it exists, anyone can use it. That might include government agencies conducting unauthorized or legally questionable searches and monitoring, or hackers, or even hostile foreign governments trying to oppress their own citizens. In this instance, the FBI is asking Apple to engineer a very complicated "skeleton key" that lets the agency cut through the security features and get to the heart of the data on the phone. It's important to note that the FBI isn't demanding help with decrypting the phone: first, it needs access to the data it wants to decrypt. So this isn't about Apple handing over the keys to its encryption.
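A little arithmetic shows why that skeleton key matters: with the auto-erase limit and inter-attempt delays removed, a short numeric passcode falls to brute force almost immediately. The 80 ms per-attempt figure below is an assumption, based on the hardware key-derivation delay Apple has described for its devices.

```python
# Back-of-the-envelope brute-force cost for a 4-digit passcode once the
# retry limit is disabled. seconds_per_try is an assumed hardware floor.
keyspace = 10 ** 4                 # every possible 4-digit passcode
seconds_per_try = 0.08             # ~80 ms per key-derivation attempt (assumption)
worst_case_minutes = keyspace * seconds_per_try / 60
print(f"{keyspace} guesses, worst case about {worst_case_minutes:.0f} minutes")
```

Under those assumptions, exhausting every 4-digit passcode takes roughly a quarter of an hour; the ten-attempt erase limit, not the passcode itself, is what makes the phone resistant.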

But once the door is open, the FBI can set about applying decryption techniques to the phone's data. And once the precedent of compelling Apple to break faith with its customers is set, it becomes that much easier to chip further away at Americans' slowly eroding privacy rights. While encryption programs remain as safe as they're engineered to be for now, that doesn't mean they'll stay that way.

In this case, a federal court ordered Apple to comply with the government's request after Apple officials refused the FBI's initial demand. Now, Tim Cook has taken to the Internet with a strongly worded open letter defending his decision, explaining why privacy matters, and announcing the company's intent to keep protecting its customers.

"The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control," he wrote, adding that "The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices." He noted that this could have a deeply chilling effect.

White House officials and the Department of Justice insist this only applies to one phone, but that's like saying a bump key will only work on your own front door, so it's perfectly reasonable to manufacture and own one. As any locksmith will tell you, the whole point of having bump keys and related tools is to have access to any door, not just one. The government wants clear and unfettered access to Farook's phone, but that translates to the same level of access to any phone running iOS, and, by extension, the legal precedent to compel other device manufacturers to provide the same functionality.

The government is trying to apply a law you might not have heard of before this week: the All Writs Act of 1789. The bizarrely broad law "simply allows courts to issue a writ, or order, which compels a person or company to do something," explains Cyrus Farivar at Ars Technica. However, the law requires that the writ in question be "agreeable to the usages and principles of law," and it is this clause that Apple is relying on to defend itself. The Supreme Court has ruled that the Act cannot be used to bypass the Constitution, nor can it impose an undue burden, like, say, having to effectively engineer an entirely new operating system.

This would be an unprecedented breach of privacy in a security landscape already littered with erosions of an American right so fundamental that it is enshrined in the Constitution: the protection from unreasonable search and seizure. America once had some of the most robust privacy protections in the world, but over the last 15 years, the government has systematically dismantled them under the guise of anti-terrorism. Warrantless wiretapping is merely one in a long line of measures that make it very difficult for Americans to lead private lives, and court after court has affirmed the government's overreaching demands.

Apple doesn't condone terrorism or the use of its technology in terrorism. Neither would most reasonable people. But security theater cannot and should not be used to ride roughshod over our civil rights, and that's why this case is so critical. If the government can establish that Apple is legally required to engineer code to break its own security features, it can do the same to Google and other companies that act to protect their customers.

That's a prospect that should be worrying, because it could be your device next. This technology would allow anybody with physical access to your phone to unlock it, whether that be law enforcement, the person who steals it from you on the subway, or someone who plans to blackmail you with content you'd rather not share with the public. Electronic media is never wholly safe, but we shouldn't be going out of our way to make it easier to crack, which is what the government is demanding in this case — and the government has repeatedly demonstrated that it cannot be trusted with our information.

There's another important implication to this case, which will likely proceed up the judicial ladder through a series of courts, potentially landing in front of the Supreme Court. With an empty seat on the bench, the court is in a position to swing in a variety of directions on privacy issues, and the public should keep this in mind when evaluating any Obama appointee, not least because the president has disregarded constitutional privacy rights on multiple occasions and clearly doesn't consider privacy a priority for Americans.

Image credit: Gonzalo Baeza/CC