Apple Needs to Cooperate with the FBI to Save Lives


There’s a hoary legal maxim that hard cases make bad law and, at first glance, the standoff between Apple and federal investigators probing the San Bernardino terror attack certainly would seem to be one of those.

The basic facts are these: Syed Rizwan Farook, along with his wife, Tashfeen Malik, killed 14 and wounded 22 in a terrorist attack on a San Bernardino government health center. They had an iPhone 5c, which was recovered from the couple’s SUV after they were killed in a shootout with police. On Oct. 19—44 days before the shooting—Farook activated an encryption option on the phone, which now prevents the FBI from determining how it might have been used in the weeks before the massacre—and with whom.

Over the past weeks, representatives of the Department of Justice and some of President Barack Obama’s top intelligence advisers have been in Silicon Valley, meeting with Apple executives in an effort to win their agreement to assist in opening the phone to law enforcement’s inspection. The security code written by Apple allows just 10 tries with the wrong entry code before the phone irreversibly erases all the data it contains, which in this case might include Farook’s notes, contacts, text messages and video.
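By way of illustration only, here is a minimal sketch in Python, not Apple’s actual firmware, of the kind of wipe-after-ten-failures logic the paragraph describes; the Device class and all of its names are hypothetical.

```python
MAX_ATTEMPTS = 10  # the limit described above

class Device:
    """Hypothetical stand-in for a passcode-protected phone."""

    def __init__(self, passcode: str):
        self._passcode = passcode      # stands in for the real key-derivation secret
        self._failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("data already erased")
        if guess == self._passcode:
            self._failed_attempts = 0  # a successful unlock resets the counter
            return True
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            # On real hardware the encryption key itself is discarded,
            # making the stored data unrecoverable; here we simply mark it.
            self._passcode = None
            self.wiped = True
        return False
```

The design makes brute-forcing futile: after the tenth wrong guess there is nothing left to unlock, which is why the FBI cannot simply try every combination.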

Investigators believe the phone may contain information vital to understanding more about the worst domestic terrorist incident since 9/11 and, perhaps, clues to other attacks still in the works. When the talks with Apple broke down, the government obtained a federal court order instructing the company to write new code that would allow the FBI access to Farook’s phone. Apple doesn’t disagree about what may be on the device, but its CEO, Timothy D. Cook, says the company will not comply with the court order because its customers have a right to use their phones in privacy and the new code inevitably will fall into the hands of hackers. For that reason, Cook says Apple will appeal the order.

At first blush, this would seem a classic governmental power versus civil liberties confrontation. And, on that basis, Apple already has won. No matter what it ultimately is compelled to do, the company already has become a heroic champion of digital privacy rights in the eyes of many. That said, as a civil libertarian whose instinctive reflex always is to insist on the greatest degree of individual privacy possible, I think the DOJ has the better argument in this case, though we ought to be extremely cautious in how widely it’s applied. There are, moreover, a number of commonplace facts that should be taken into consideration here along with the novel ones this case presents.

First, there is no area of our lives — outside the confessional — where our expectation of privacy is absolute. With the proper warrants, the police can search our homes, our cars, our financial records and, in some cases, our persons. Why should telephonic activity be granted greater privacy than our homes and bodies, whose intrinsic sanctity is enshrined in the deepest reaches of the common law? If the government can proscribe the construction of an uncrackable safe—as it does—why not forbid an uncrackable smart phone? How is what the government is asking now really any different from a telephone tap carried out under legal warrant? How is it different from a warrant allowing investigators to read your mail?

Second, the phone at issue didn’t belong to Farook. It is owned by San Bernardino County, which issued it to him in his capacity as a health inspector. The county has said it wants the phone opened to FBI inspection. When Farook received the phone, it was set to store all its data in Apple’s iCloud, which is accessible to legal subpoena. He disabled that setting in favor of the encryption option, without the consent of his employer, the phone’s legal owner. Why should the assumed right to privacy of a rogue employee trump that of the device’s actual owner?

For the sake of argument, though, let’s assume that the digital world is different from all that has gone before and that this case actually poses questions of consequence never before considered on their own merits. Fair enough: Apple is being asked to do something novel. The code the government needs to get into Farook’s phone does not exist, and only the company’s people can write it. All programming code for the device’s iOS operating system must bear the encrypted digital signature of the person who wrote it. Only Apple employees have such signatures. Therefore, Apple’s code writers would be compelled to create something that does not now exist, rather than simply to turn over to the government something being withheld.
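For readers curious about the mechanism, here is a simplified sketch of signature-gated code loading, using the well-known third-party cryptography package’s Ed25519 primitives as a stand-in for Apple’s actual signing scheme; the function names and key handling are hypothetical illustrations, not Apple’s implementation.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Only the vendor holds the private signing key; devices ship with the public key.
vendor_private_key = ed25519.Ed25519PrivateKey.generate()
device_trusted_key = vendor_private_key.public_key()

def vendor_sign(code_image: bytes) -> bytes:
    """What only the vendor can do: produce a valid signature for a code image."""
    return vendor_private_key.sign(code_image)

def device_load(code_image: bytes, signature: bytes) -> None:
    """What the device does before it will run any code image."""
    try:
        device_trusted_key.verify(signature, code_image)
    except InvalidSignature:
        raise RuntimeError("refusing to run unsigned or tampered code")
    print("signature valid; code accepted")

firmware = b"hypothetical new operating-system code"
device_load(firmware, vendor_sign(firmware))   # accepted: signed by the vendor
# device_load(firmware, b"\x00" * 64)          # rejected: anyone else's "signature" fails
```

The asymmetry is the whole point: anyone can check a signature, but only the holder of the private key can create one, which is why the court order necessarily conscripts Apple’s own engineers.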

Just to complicate matters still further, many digital adepts consider code a form of speech.

As WordPress co-founder Matt Mullenweg, who thinks deeply and knows a bit about these things, has said, “Code is poetry.” And so it may be in this new digital landscape.

If financial contributions are a form of speech, as the Supreme Court ruled in the Citizens United case, then why not code?

To my knowledge, no court order ever has compelled creative speech into being where none existed before. If you find this line of thought appealing, then a victory for the government would be novel, indeed—and troubling in its wider implications.

This entire encryption controversy—and Apple simply is the foremost of the tech giants arrayed against the government—arose out of the general embarrassment that tech and phone companies suffered when Edward Snowden’s revelations about government spying revealed the degree of their cooperation with the NSA and other intelligence agencies. In 2014, Apple rewrote its smart phone code to make its customers’ privacy theoretically inviolable. Since then, the company—America’s largest—has staked its global reputation on two pillars: privacy and beautifully designed hardware.

Both are respectable attributes; the problem arises in the way Apple conceives of privacy. Like other tech companies, it collects an almost unimaginable amount of data—casual and deeply intimate—on its customers. Take a look at what the agreement to its Siri function entitles the company to do and you’ll get the idea—basically, Apple is empowered to know everything about you. At the same time, the company has been slow and grudging in addressing the widely available information young hackers have posted online about how to break into Siri-enabled devices.

In other words, Apple appears to believe that its customers have a right to privacy from scrutiny by anyone but Apple itself.

Then there’s the question of whose privacy Apple really is guarding. Most of its iPhones are sold outside the United States, and China is its biggest market. Cook may reasonably fear that if Apple agrees to write code to give the FBI access to Farook’s phone, Beijing’s state security agencies soon may ask for similar code to monitor the phones of Chinese dissidents. That’s a reasonable anxiety, but the immediate question at issue has to do with the physical security of Americans. It may be that prudent Chinese democrats and reformers will cease using smart phones if they want to shield themselves from authoritarian pressures. That would be a great inconvenience to them and a commercial blow to Apple. That, however, is a problem to be dealt with by the Chinese. We ought to rely on John Quincy Adams’ sentiment that we are “the friends of liberty everywhere, but the custodians of none but our own.”

Does Apple really want a world in which every terrorist’s two essential pieces of equipment are an AK-47 and an iPhone?

In this case, there can be no liberty without life, and what the U.S. government is about here is the preservation of American lives. That ought to be dispositive in this instance. Apple needs to be a good American citizen, as well as a good digital one, and cooperate with the FBI.
