Last December, Syed Rizwan Farook and his wife murdered 14 of his San Bernardino County Dept. of Public Health co-workers at a holiday party, and wounded 21 others. After leaving crude explosive devices behind in hopes of killing emergency responders, the two Islamic terrorists fled, and were themselves killed in a gunfight with pursuing law enforcement officers.
The subsequent investigation turned up lots of physical, financial, and computer evidence, but the FBI was unable to get into Farook’s encrypted iPhone 5C. Last month, through an order from a U.S. Magistrate Judge, the Bureau demanded that Apple, Inc. create a custom iPhone operating system, one with key security features disabled, which the FBI would use to recover all the data on the phone. There’s plenty of good reporting on this case, so I’m mostly going to talk about the attitudes involved in the ongoing debate about privacy, encryption, and surveillance.
Despite FBI Director James Comey’s previous statements that Farook and his wife were not part of a larger terrorist cell, but had “self-radicalized” using freely available internet material, getting into the phone was described as being of the utmost importance. Despite former White House counterterrorism czar Richard Clarke’s observation that the NSA could easily crack the phone, the FBI demanded that the entire weight of Federal authority instead be used to compel Apple to create a reusable phone-breaking tool.
It’s obvious that creating legal precedent was a bigger goal for the FBI than seeing inside the phone of a months-dead Jihadi. The ability of the U.S. government to ask private companies to assist in investigations is one thing; the ability to force private companies to use their proprietary information to create custom-tailored products for the government (and for free) is another. With this legal precedent, could a police department demand that a dozen new Corvettes be donated and smashed to help solve a drunk driving case? Could the DEA order Chevrolet to install drug-sniffing devices in all of its cars? Would Federal courts then punish Chevrolet for being physically unable to create the magical drug-sniffer that the DEA had dreamed up?
But let’s ignore that slippery slope for a moment, and get back to encryption. The FBI-Apple case brought a lot of opinions out into the light, and the most common sentiment was that encryption is a new frontier that changes the world; that this application of simple math is a game-changing technology that needs careful thought because it alters everything about how we have historically exchanged information. Even President Obama made this point at Austin’s South by Southwest festival:
“If technologically it is possible to make an impenetrable device or system, where the encryption is so strong there’s no key, there’s no door at all, then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? If in fact you can’t crack that at all, government can’t get in, then everybody’s walking around with a Swiss bank account in their pocket.” The President went on, “If your argument is strong encryption no matter what… that I think does not strike the kind of balance we have lived with for 200, 300 years.”
But historically speaking, the kind of unthinkable privacy the President is describing was the norm for the last 300 years (and longer). Swiss bank accounts, just for illustration, have been famous for their secrecy since the 1700s, and only recently became a horrifying worst-case illustration for a sitting President. The idea that governments must be permitted instant and total access to financial information is a very recent one; the U.S. government wasn’t even allowed to see into U.S. banks until the Bank Secrecy Act of 1970.
And while our President accused encryption advocates of “fetishizing our phones,” the argument isn’t really about phones. My phone is just the device that currently contains (or, more accurately, displays) my private correspondence. The arguments for and against encryption have more to do with personal information than devices. In the past, governments have always had the ability to pursue information, and this has always been hard work.
Prior to the digital revolution, a wiretap warrant meant that a physical device had to be installed on a physical phone line, and then physical human beings had to take the time to carefully listen to all the conversations on that phone line. Mail could be intercepted and read, but again, only by a human. If a suspect’s location needed to be tracked, a human detective could follow him. This has been true in both free nations and totalitarian regimes.
A government’s powers of surveillance have, in the past, always been limited by the number of investigators or spies available. They have also been limited by the temporary nature of information; it was impossible to retrieve conversations held before the wiretap was installed, for example. It was also difficult to legally misuse these powers, because any misuse would be witnessed by all the humans who had to be involved. Investigations, therefore, have always been prioritized around the greatest crimes or the greatest threats.
But today, things are different. Ten years of my emails are sitting on a Google server, including backups of all the messages that I have deleted. A huge backlog of my phone calls is being stored by AT&T, along with in-depth logs of my internet usage. Everywhere my phone has ever connected to a cell tower can be shown on a map with just a few keystrokes. This amount of information, and the ease with which governments can access it, is what’s new.
And, more importantly, all of this data can be searched and stacked and collated by powerful algorithms. A computer can make you the suspect of a crime based on ancient emails and locational proximity. Perhaps you haven’t committed a crime yet, but Facebook has already profiled you onto a watchlist, and you just bought a pressure cooker on Amazon. Evidence that judges used to exclude as “circumstantial” now launches investigations, and those investigations are usually prioritized around low-hanging fruit rather than the highest threats.
This is the big game-changing unknown territory, not encryption. Encrypting our emails and our phone calls just gets us back to the basic privacy we enjoyed before the 1990s, before massive server storage and big-data analysis showed up. It gets us back to a time when investigators had to carefully choose where to spend limited decryption resources rather than simply seeing everything, everywhere, all the time. And various government agencies in various countries don’t like that. They have really enjoyed total access, and they want to hang on to it.
Which is why the FBI wanted Apple to make them a phone-breaking tool, and why they wanted a legal precedent that could be used to compel other private companies to give them the kind of access to which they have grown accustomed. As the court date approached, however, the FBI chickened out. At the last minute, they claimed to have found another way to get into the phone, probably because their legal analysts weren’t confident of a win.
But even though the case was dropped, we, the people, didn’t really win either. While James Comey assures us that this case was only about one phone and one terrorist, the Justice Department has promised that it will continue to use the courts to force the private sector to defeat encryption. So the next time this question comes up, remember that the underlying issues are not related to terrorism, crime, or even cryptography, but basic questions about freedom, privacy, and limited government.