Apple and the FBI
The government wants Apple to unlock a terrorist’s iPhone. Are they asking the impossible?
In the wake of the December terrorist attack at Naval Air Station Pensacola, in which three service members lost their lives, the federal government has renewed calls for Apple to provide a “backdoor” by which authorities can bypass the encryption on iPhones.
In this brief article, we'll look at the background to this latest clash between Apple and the U.S. Department of Justice, and then examine the issue from a technical perspective, explaining just what it would mean for Apple to break its own encryption.
The background
Apple is a famously privacy-minded company, which makes it something of an outlier in Silicon Valley. The company intentionally limits the amount and type of user data processed by and stored on its servers. Its native messaging service, iMessage, provides end-to-end encryption by default. The newest version of iOS is designed to give users unprecedented control over how much location information third-party apps are allowed to collect. And the contents of every iPhone are encrypted at the device level, meaning that even Apple can't access the data stored on a user's phone.
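To see why device-level encryption locks out even Apple, it helps to look at how the keys are made. Apple has documented that iOS "tangles" the user's passcode with a secret key fused into the device's hardware (the UID) when deriving encryption keys. The sketch below is a simplified, hypothetical model of that idea, not Apple's actual implementation; the names `DEVICE_UID` and `derive_data_protection_key` are ours, and the iteration count is arbitrary.

```python
import hashlib
import os

# Illustrative stand-in for the secret key fused into the phone's hardware.
# On a real device, this value never leaves the chip and is unknown to Apple.
DEVICE_UID = os.urandom(32)

def derive_data_protection_key(passcode: str, device_uid: bytes) -> bytes:
    """Tangle the user's passcode with the hardware UID via a slow KDF.

    A simplified analogue of iOS data protection: because the hardware UID
    is one of the inputs, the key can only ever be computed on the device.
    """
    return hashlib.pbkdf2_hmac(
        "sha256",            # underlying hash function
        passcode.encode(),   # the only secret the user knows
        device_uid,          # the secret only the hardware knows (as salt)
        200_000,             # deliberately slow, to make each guess costly
    )

key = derive_data_protection_key("123456", DEVICE_UID)
print(f"derived key: {key.hex()}")
# Apple's servers hold neither the passcode nor the UID, so even with
# physical possession of the encrypted data, the key can't be rebuilt.
```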
This last point has led to legal clashes before, most notably in 2016, when Apple challenged a court order requiring it to help the FBI access the encrypted contents of an iPhone in connection with the investigation into the 2015 San Bernardino terrorist attack.
The matter reached Congress, with then-FBI Director James Comey and Apple's legal counsel arguing their respective cases before the House Judiciary Committee. The dispute was still unresolved when the government unexpectedly announced that it had found a third party capable of unlocking the device, rendering the legal question moot.
New administration, same arguments
After the shooting in Pensacola, the FBI requested Apple’s assistance in the investigation — and Apple complied, providing access to the attacker’s iCloud data as well as other information.
But the FBI had also asked Apple to unlock the phone itself, so that investigators could determine whether the terrorist had acted alone — and as in 2016, Apple said that it was unable to do this.
This has drawn harsh criticism from U.S. Attorney General William Barr, who rebuked Apple for failing to offer “any substantive assistance” in the investigation. Some members of Congress have been even more vehement, with Republican Senator Tom Cotton going so far as to accuse the company of “siding with terrorists”. Even President Trump has gotten involved, calling for Apple to “step up to the plate” and comply with the FBI’s requests.
Apple has doubled down on its commitment to strong encryption, saying in a public statement:
“We have always maintained there is no such thing as a backdoor just for the good guys. Backdoors can also be exploited by those who threaten our national security and the data security of our customers”.
Meanwhile, civil liberties groups have spoken up in support of Apple, with the ACLU commenting:
“This is bigger than any single iPhone: The government’s demand would weaken the security of millions of iPhones, and is unconstitutional”.
Clearly there are two very different views on the issue. But leaving politics aside, what would it mean, from a technical perspective, for Apple to create a “backdoor” that would enable the government to access an encrypted device? And what are the security ramifications of building such a tool?
GovtOS 1.0
The iPhone is a very secure device, with built-in safeguards designed to prevent exactly the kind of access the FBI is seeking. These protections are "baked in" to the core design of an iPhone's hardware and software, such that even Apple itself is unable to access a properly functioning device. This is attractive from a user-privacy standpoint, but understandably frustrating for law enforcement personnel conducting legitimate investigations.
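As a rough illustration of those safeguards, the toy model below tracks failed passcode attempts, imposes escalating delays, and destroys the encryption keys after the tenth failure. This is our own simplified sketch, not Apple's code; the delay schedule approximates figures Apple has published in its iOS security documentation.

```python
import time

# Approximate delay schedule, in seconds, keyed by failed-attempt count
# (no delay for the first four failures; one hour after the ninth).
DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
MAX_ATTEMPTS = 10

class Device:
    """Toy model of an iPhone's passcode safeguards."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self._wiped = False

    def try_passcode(self, candidate: str) -> bool:
        if self._wiped:
            raise RuntimeError("encryption keys erased; data unrecoverable")
        if candidate == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._wiped = True  # auto-erase: keys destroyed, data useless
        else:
            time.sleep(DELAYS.get(self._failures, 0))  # escalating delay
        return False

phone = Device(passcode="7391")
print(phone.try_passcode("0000"))  # False -- and the failure counter ticks up
print(phone.try_passcode("7391"))  # True -- the correct passcode unlocks
```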
To bypass a device's encryption, Apple would essentially have to create a tool that subverts the core functionality of iOS and the iPhone itself. Such a tool could potentially be used on any iPhone, not simply the iPhone of a suspect under investigation. In the San Bernardino case, for example, the court order demanded that Apple engineer a single-use version of iOS to be installed on the locked phone — minus several key security features. In particular, the FBI wanted Apple to disable the device's auto-erase functionality, which normally deletes the encryption keys after 10 failed access attempts, and to provide a way to input passcodes electronically through the iPhone's physical port. Any device running this "less secure" OS could then be unlocked by brute force: trying passcodes, without limit, until the correct one was found. The FBI attempted to reassure Apple that the tool could be built so as to guarantee true "one-time" use. Apple was unconvinced.
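To make the stakes concrete, here is a rough sketch of what that brute-force attack looks like once the safeguards are gone. It is a toy model, not real unlocking code: `unlock_attempt` stands in for electronically submitting a passcode through the device's port, `TRUE_PASSCODE` is a made-up value (in reality, it's the unknown), and we assume auto-erase and the escalating delays have already been disabled, as the court order demanded.

```python
# Toy model of brute-forcing a numeric passcode once the FBI's requested
# changes are in place: no wipe after 10 failures, no escalating delays,
# and passcodes submitted electronically rather than typed by hand.

TRUE_PASSCODE = "7391"  # hypothetical; this is what the attacker doesn't know

def unlock_attempt(candidate: str) -> bool:
    """Stand-in for submitting one passcode through the device's port."""
    return candidate == TRUE_PASSCODE

def brute_force(num_digits: int = 4):
    """Try every numeric passcode of the given length, in order."""
    # 10**4 = 10,000 candidates for a 4-digit code; 10**6 = 1,000,000
    # for a 6-digit code. Trivial work once nothing limits the attempts.
    for n in range(10 ** num_digits):
        candidate = f"{n:0{num_digits}d}"  # zero-padded, e.g. "0042"
        if unlock_attempt(candidate):
            return candidate
    return None

print(brute_force())  # prints "7391" after at most 10,000 attempts
```

Apple's security documentation has cited a figure of roughly 80 milliseconds per passcode attempt for its hardware-bound key derivation. Even at that deliberately slow rate, exhausting a four-digit space takes about 13 minutes, and a six-digit space under a day — which is exactly why the attempt limits and auto-erase matter so much.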
Apple's main objection to creating what it has called "GovtOS" is that, once in existence, it would potentially be available to anyone who managed to get their hands on it: a clear privacy risk to millions of iPhone users worldwide. And as we will see, such fears are not unfounded.
The problem with backdoors
The main issue with creating an iOS backdoor, from Apple's point of view, is that the only way to do it would be to create a less safe version of its own product — which would expose millions of its customers to potential danger. Apple's position, as outlined in a public letter during the San Bernardino debate, is that this kind of request is unheard of in the history of U.S. law enforcement and would, if granted, set a troubling legal precedent.
There is also, of course, the question of civil liberties — and in particular, whether the government can be taken at its word when it says that it will only use backdoors in specific cases. There has already been evidence of the CIA covertly exploiting software vulnerabilities in order to gather intelligence, and although law enforcement agencies say that they always obtain warrants before monitoring a citizen's communications, those warrants are often approved on the strength of disturbingly vague justifications.
But even leaving these legal considerations aside, it's worth asking whether the government could actually keep a backdoor from falling into the wrong hands. Leaked NSA tools have already been implicated in high-profile ransomware attacks, raising questions about how secure an iOS backdoor would be in the government's care. In addition, the U.S. government is itself not immune to espionage: moles like Robert Hanssen are discovered on occasion, sometimes after operating undetected for years. At a time when many are concerned about foreign interference in U.S. elections — and as mobile voting is poised to go mainstream — the prospect that someone within the government could intentionally leak an iOS backdoor to a foreign intelligence agency is deeply troubling.
Finally, a backdoor would also create an incentive for the software engineers who built it to use their knowledge for personal gain — either on their own or by selling information to the highest bidder. The vast majority of high-level developers at Apple are no doubt ethical, but it would take only one person "going to the dark side" to compromise the privacy and security of millions. In the eyes of many, this is an unacceptable risk.
The cost of freedom
The government will likely continue to pressure Apple to provide backdoor access to iOS. But Apple has good legal and technical reasons to push back, and its position on the issue has been consistent for years. If a compromise can't be reached, we may see legislation compelling Apple (and other tech companies) to provide the government with the kind of access it wants. If that happens, the likely result will be a legal battle — one which could make it all the way to the Supreme Court.
How such a case would be decided is anyone's guess, but there is some reason to think that the law might eventually come to view encryption as a net positive for society. Unbreakable encryption undoubtedly hampers criminal investigations from time to time, but the privacy protections it offers to individuals, as well as to the body politic, may be worth the sacrifice on balance. In this sense, encryption is somewhat akin to other features of legal systems committed to individual rights and due process. Civil liberties protections afforded to citizens can also be inconvenient for law enforcement, and occasionally result in criminals escaping justice. But this is generally accepted as the price that must be paid to live in a free society. Only time will tell whether the law will one day see strong encryption in the same way.