Apple, Back Doors and the City on a Hill

How long will the FBI take to find the key?

Bad facts make bad law. This well-worn legal aphorism may well describe the state of American privacy law if the FBI is successful in its bid to compel Apple to write a special version of its iPhone operating system to provide a “back door” into one of its legacy devices.

Ostensibly, the question presented is whether Apple can refuse a court order to assist the FBI in gathering evidence from one specific phone provided by the employer of a dead terrorist. However, the potential precedential significance of the case is far greater.

The late Syed Rizwan Farook, together with his wife, Tashfeen Malik, killed 14 people and wounded 22 others in San Bernardino, California, before meeting a swift and just end. As part of its investigation, the Federal Bureau of Investigation sought and obtained a court order under the All Writs Act of 1789 directing Apple to write a special version of its iOS to defeat two security features included in the operating system of Farook’s legacy iPhone 5c. In its request, the FBI was at pains to stress the supposedly limited nature of the cooperation sought: just a little bit of code, to be downloaded one time onto a single out-of-date phone owned by the government employer of a dead terrorist-murderer. How could that be unreasonable?

In this case I believe it is, but the decision is neither an easy one nor an absolute one for all cases. Rather, as I argue below, these types of requests by law enforcement should be judged on a reasonableness standard based on their unique facts and circumstances, with personal privacy and freedom of communication given a strong but rebuttable presumption.

The Farook case is just one of many requests the FBI has made to Apple for assistance in recovering information believed to be stored on its phones or in iCloud. The FBI has chosen well to litigate this matter, as the Farook facts make an attractive test case for the Bureau. While Apple has generally complied with such requests in the past, it has refused in this instance, arguing that in the balance between public safety and our right to privacy, acceding to the FBI’s request here would set a dangerous precedent. In particular, Apple argues that it would be impossible to limit the “back door” the FBI seeks to unlocking only the late terrorist’s phone without jeopardizing others. To understand why requires some additional background on Apple technology and criminal procedure.

What the FBI is asking Apple to do in the Farook case is to create a special version of iOS (it has been dubbed “govtOS”) that, once installed on Farook’s 5c, will (i) allow an unlimited number of attempts to enter the four-digit passcode and (ii) eliminate the artificial and progressive delay that the standard iOS introduces after repeated failed attempts. This would allow the FBI to use what is known as a “brute force” technique to try as many as all 10,000 possible four-digit combinations needed to unlock the phone.
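The arithmetic here is what makes the request so potent. Once the retry cap and escalating delays are removed, a four-digit passcode offers only 10,000 possibilities, which a trivial script can enumerate. A minimal sketch of the idea (the `check_passcode` callback below is a stand-in for the device’s verification routine, not Apple’s actual interface):

```python
def brute_force(check_passcode):
    """Try every four-digit passcode from 0000 to 9999 until one unlocks.

    check_passcode: callable taking a passcode string and returning True
    if it unlocks the device. Returns the winning passcode, or None.
    """
    for attempt in range(10_000):
        candidate = f"{attempt:04d}"  # zero-padded, e.g. "0042"
        if check_passcode(candidate):
            return candidate
    return None


# Example against a hypothetical stored passcode:
secret = "7391"
found = brute_force(lambda guess: guess == secret)
print(found)  # → 7391
```

Apple has said that each attempt on the device reportedly takes on the order of 80 milliseconds because of the hardware-entangled key derivation, so even without the artificial delays, exhausting all 10,000 combinations would take only minutes.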

The FBI analogizes its request to serving a lawful search warrant on a landlord to gain access to a suspect’s apartment with a master key. I think a more apt analogy would be a request that the landlord send a team of its handymen to replace an existing wall of the suspect’s apartment with a fake wall containing a secret doorway. The extraordinary nature of the request consists in forcing the recipient to build something it would not otherwise undertake and which, in fact, it believes would render the apartment (or phone) insecure. The secret doorway, like the special iPhone back door, would admit not only the FBI but anyone far less trustworthy to the contents of the apartment or phone.

As for the criminal procedure context, it is important to understand what would happen if and when the FBI gained access to Farook’s phone, and in particular whether the FBI could keep its promise that the special back door created for the Farook case would never become public or fall into dangerous hands. Since both prime suspects are dead, investigators presumably want the phone to discover whether there are accomplices who could be identified via information currently encrypted on it, or other information that would be useful in preventing future attacks or adding to our understanding of terrorist networks. There is no suggestion, however, of a clear and present danger of another specific attack, or of lives at risk, that makes access to the contents of Farook’s phone critical. Such a danger, to borrow these talismanic words from the First Amendment free speech context, would convince me to overcome my presumption in favor of personal privacy and agree to the FBI request.

Absent such an extreme threat, I assume the investigation might play out along the following lines: via access to Farook’s phone, the FBI could discover, arrest and charge one or more other conspirators. What would happen next? The defense lawyers and their forensic experts would demand access to the rogue govtOS code to dispute that the system worked as alleged to implicate their clients. For example, they would be free to argue that the name of their client was inserted by the FBI via the special govtOS code and was not present in the original memory. Thus, at least to me, it does not sound farfetched that the back door code would then be released and pass through many hands, exposing us all to cyber insecurity.

While we are on the subject of the law, I find it ironic that all of the candidates seeking the 2016 Republican nomination extol their steadfast dedication to the Second Amendment to the US Constitution (conferring a right to bear arms), yet rush to condemn Apple for refusing to comply with the magistrate’s ex parte order to build a back door into the iPhone. As I understand it, a major tenet of the conservatives’ insistence on the right to carry concealed weapons or automatic rifles is that it will protect Americans from the potential tyranny of their government. If my goal is protecting liberty, however, I find it far more effective to protect our rights of free speech and freedom of assembly by allowing the use of encrypted communications than by maintaining an arsenal at home to use against a drone-equipped national military. Hence my preference for a rebuttable presumption in favor of privacy: just as I support the right of Americans to own guns, but not AR-15s, Constitutional rights should not be absolute when they bump up against each other.

Finally, I believe there is one additional and decisive reason to support Apple in its refusal to create even “limited” back doors. The Internet is a great force for freedom and democracy based on its ability to connect us, remove friction and rapidly disseminate information. Like most technologies, it is also “dual use” and can facilitate terrorist coordination and cybercrime. No matter how noble the motives of the FBI in seeking to crack Syed Farook’s iPhone, repressive regimes around the world are watching and will happily order Apple and other American technology companies to write potentially more dangerous and intrusive “limited” instruments to facilitate their law enforcement efforts. They may do so even if US courts ultimately support Apple’s position; however, the United States should not abandon the moral high ground.

The City on a Hill should not leave the back door open.

Tom Glocer is the former CEO of Reuters.
