Core Values: Why Apple Is Right in Refusing to Hack an iPhone for the FBI
March 09, 2016
On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like “1984.”
Those were the closing lines of Apple Computer’s groundbreaking (and screen-shattering) advertisement during the 1984 Super Bowl for its new computer, the Macintosh. The allusion, of course, was to George Orwell’s equally groundbreaking dystopian novel Nineteen Eighty-Four, a book whose forecast was so disturbing, and so credible, that it turned the author’s name into a veritable synonym for pervasive surveillance.
Apple was not aiming its sledgehammer of an ad at the Big Brother of government but instead at what it perceived to be the mindless lockstep adoption of the IBM PC and its clones (it was no accident that the ad’s “Big Brother” bore at least a vague resemblance to Microsoft co-founder Bill Gates). For Apple, the great battle in 1984 was against the forces of grinding conformity rather than the threat of excessive or invasive governmental searches.
Nonetheless, given Apple’s invocation of Orwell, it is appropriate that the company now finds itself, thirty-two years later, at the forefront of an intense battle over personal privacy, the encryption of smartphone data, and national security. In the wake of the horrific mass shooting in San Bernardino in December 2015, federal authorities seized an iPhone 5C issued by the San Bernardino County Department of Public Health to Syed Rizwan Farook (one of the two assailants), who worked at the Department as a food inspector. Apple provided investigators access to the data Farook had backed up to the company’s iCloud servers, but the most recent data from the 5C had not been backed up.
As is the case with all later-model iPhones, the data on Farook’s 5C is encrypted and can only be accessed by entering the correct passcode to unlock the phone. In theory, the FBI could test all 10,000 possible four-digit passcodes in a matter of minutes. However, iOS can be configured to wipe a phone’s encryption keys after ten incorrect guesses. The FBI asked Apple to write special software (essentially, a replacement iOS) that would allow the agency to enter as many guesses as needed to open the phone. When Apple refused, the FBI obtained a court order instructing Apple to provide the necessary technical assistance. Apple reiterated its refusal and vowed to appeal the order. A number of other major tech companies—Google, Facebook, Microsoft, Evernote, Dropbox, and others—have announced support for Apple’s position and their intention to file an amicus brief opposing the government.
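To make the arithmetic concrete, here is a minimal sketch in Python of the dynamic just described, assuming a four-digit passcode. The names (ToyPhone, try_unlock, brute_force) are hypothetical, and the model is deliberately crude: Apple’s actual iOS also imposes escalating delays between attempts and ties key handling to the hardware, none of which is modeled here.

# A minimal, purely illustrative sketch (hypothetical names throughout; this is
# not Apple's code): why a four-digit passcode space is trivially searchable in
# principle, and how an erase-after-ten-failures policy defeats the search.

MAX_FAILED_ATTEMPTS = 10  # the limit used by iOS's optional "Erase Data" setting


class ToyPhone:
    """A toy stand-in for a passcode-locked device."""

    def __init__(self, passcode, auto_erase=True):
        self._passcode = passcode
        self._auto_erase = auto_erase
        self._failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess):
        """Return True if the guess is right; track failures and wipe if needed."""
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self._failed_attempts += 1
        if self._auto_erase and self._failed_attempts >= MAX_FAILED_ATTEMPTS:
            self.wiped = True  # encryption keys discarded; data is unrecoverable
        return False


def brute_force(phone):
    """Try all 10,000 four-digit passcodes (0000 through 9999) in order."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if phone.try_unlock(guess):
            return guess
        if phone.wiped:
            return None  # the auto-erase policy stopped the search cold
    return None


# With auto-erase enabled, the search dies after ten wrong guesses:
print(brute_force(ToyPhone("7531", auto_erase=True)))   # prints None
# With auto-erase disabled (in effect, what the FBI asked Apple to make
# possible), exhaustive guessing succeeds almost instantly:
print(brute_force(ToyPhone("7531", auto_erase=False)))  # prints 7531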
Ironies abound, of course. Every single tech company involved in this matter, from Apple to Snapchat(!), has to one degree or another actively harvested vast amounts of highly personal data from its customers, often surreptitiously. (In 2011, for instance, Apple was flagged for storing more than a year’s worth of data about each iPhone user’s location in an unsecured file on the phone.) More insidiously, many if not most of Apple’s allies have business models explicitly built on persuading consumers to voluntarily share personal information with the company, their friends, and a significant chunk of the world’s population. And of course, we enthusiastically do so. Every day, we post billions of photos, updates, likes, and videos, all of which provide enormous fodder to advertisers and investigators alike.
Given that most of us shuffle through life stirring up an easily collected cloud of personal digital data, much like the Peanuts character Pig-Pen, why has Apple decided to draw this particular line in the silicon? The answer, I think, is a combination of philosophy, politics, and the personal.
During a live interview at the SXSW festival in Austin, TX, on March 11, President Obama acknowledged the concern that many people feel over easier governmental access to digital data: “[T]here are very real reasons why we want to make sure that government cannot just willy-nilly get into everybody’s iPhones that is full of—or smartphones that are full of very personal information and very personal data. And let’s face it, the whole Snowden disclosure episode elevated people’s suspicions of this.”
At the same time, however, the President made it clear that he has no patience with digital privacy absolutists. If encryption grows too powerful, he argued, it will become much more difficult, if not impossible, to investigate and arrest child pornographers, terrorists, embezzlers, and tax cheats.
“So if your argument is strong encryption, no matter what, and we can and should, in fact, create black boxes, then that I think does not strike the kind of balance that we have lived with for 200, 300 years. And it’s fetishizing our phones above every other value. And that can’t be the right answer. I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible, the key is as secure as possible, it is accessible by the smallest number of people possible for a subset of issues that we agree are important.”
As is typical for our Law Professor-in-Chief, President Obama’s position is both reasoned and apparently reasonable. But it is fair to question whether technology in general and mobile devices in particular have already shifted the balance between investigation and privacy; we store so much information on our personal devices (and through them, in the cloud) that access to those devices confers truly awesome power on whoever gains it. While I am glad that the President is “way on the civil liberties side of this thing,” we need to keep in mind that people suffer harm from misused personal data every single day, while the types of crimes highlighted by the President are fortunately less frequent and are routinely solved without the type of backdoor decryption powers sought by the FBI.
As Apple CEO Tim Cook wrote to the company’s customers: “Make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor.” As compelling as the facts of this case are, Apple recognizes that the FBI’s request is the proverbial nose of a very large camel. And Apple is almost certainly correct. If the company is compelled to hack its own iOS, it will be flooded with requests from law enforcement agencies around the United States. Prosecutors and investigators have made it clear that if Apple is ultimately required to hack the San Bernardino iPhone, they will seek similar orders demanding that Apple pry open dozens or hundreds of other seized iPhones. Federal law enforcement officials have also lobbied Congress repeatedly for a permanent “backdoor” that would enable investigators armed with a warrant to bypass a cellphone user’s passcode without getting tech company assistance.
The significance of this battle cannot be overstated. There are few devices known to man that are more ruthlessly efficient at collecting, storing, and distributing personal information than a smartphone. The average smartphone contains so much personal information, in fact, that the U.S. Supreme Court recently ruled unanimously, in Riley v. California (2014), that police must obtain a search warrant before examining the contents of a seized phone. As Chief Justice Roberts noted:
“[A] cell phone search would typically expose to the government far more than the most exhaustive search of a house: A phone not only contains in digital form many sensitive records previously found in the home; it also contains a broad array of private information never found in a home in any form—unless the phone is.”
As intrusive as data collection by private companies can be, the negative consequences (unwanted ads, commercial profiling, even credit redlining) pale in comparison to government power over our property, our liberty, and even our lives. As I wrote in American Privacy, we don’t have to look far back in our nation’s history to find instances of government misuse of personal information. Nixon, with his enemies list and abusive IRS practices, is the best-known example, but similar abuses have flared up at all levels of government. (Among other things, there are numerous reports of investigating officers downloading and sharing nude photos and videos that they discovered while examining seized cellphones.)
No one with the personal life experiences of Tim Cook can be sanguine about the risks of easier government access to the increasingly large amounts of personal information that we carry around with us every day. That’s particularly true given the appalling possibility of the election of Donald Trump, a businessman-turned-demagogue who has demonized one group after another. Just imagine, for instance, the demands a Trump administration would make of tech companies for information that could help it round up and deport every illegal alien (or suspected illegal alien, or anyone suspected of helping suspected aliens, and so on).
When it comes to personal privacy, there is a great deal that tech companies could and should do better. For instance, I prefer the European model of opt-in when it comes to the collection and redistribution of data, as opposed to the more common U.S. approach of opt-out. Privacy settings should be simpler and easier to use. I also still support the creation of a Federal Privacy Protection Agency to establish best practices, to levy fines where appropriate, and to serve as an advocate for consumers. But every one of us should applaud Apple and its allies for their willingness to spend their time and money (even at the risk of a Trumpian boycott) to help maintain a proper balance between the government and its citizens in these data-driven times.
Update, March 15: This post has been updated to reflect President Obama’s March 11 remarks at the South by Southwest festival in Austin, TX, where he discussed his opposition to the stance on encryption taken by technology companies such as Apple.
About the Author
A graduate of Amherst College and Boston College Law School, Frederick S. Lane is an author, attorney, and lecturer. He has written seven books on the impact of emerging technologies on law, society, and culture. Two of those books have been published by Beacon Press: American Privacy and The Court and the Cross. After 23 years in Vermont, Lane moved to Brooklyn, NY in 2013 with his wife, Dr. Amy Werbel. Follow him on Twitter at @fsl3.