CITIZENS have a right to both security and privacy. The difficulties arise when these two rights are in conflict, as they now are in the battle between the world’s most valuable company and its most famous law-enforcement agency. Apple has refused to comply with a court order to help the FBI unlock an iPhone used by Syed Farook, one of the terrorists involved in the San Bernardino shootings in December. The company says the government’s request fundamentally compromises the privacy of its users; the feds say that Apple’s defiance jeopardises the safety of Americans (see article).
Some frame the stand-off in terms of the rule of law: Apple cannot pick and choose which rules it will obey, they say. That is both true and beside the point. The firm has the right to appeal against a court order; if it eventually loses the legal battle, it will have to comply. The real question is whether Apple’s substantive arguments are right. That hinges on two issues.
The first is whether the FBI’s request sets a precedent. The law-enforcers say not. This is not an attempt to build a generic flaw in Apple’s encryption, through which the government can walk as needed. It is a request to unlock a specific device, akin to wiretapping a single phone line. The phone belonged to a government department, not to Farook. Apple and other tech firms regularly co-operate with the authorities on criminal cases; this is no different. Yet Apple is being asked to do something new: to write a piece of software that does not currently exist in order to sidestep an iPhone feature that erases data after ten unsuccessful password attempts. iPhone models more recent than the one Farook used are harder to compromise in this way. But if the court’s ruling is upheld, it signals that companies can be compelled by the state to write new operating instructions for their devices. That breaks new ground.
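To make the mechanism concrete, here is a minimal sketch in Python of the kind of attempt-limited wipe at issue. It is an illustration only: every name is hypothetical, and Apple’s real protections live in dedicated hardware and firmware, not in code like this. The point it shows is that the FBI’s request amounts, in effect, to a modified build with the erase step removed, after which all 10,000 four-digit passcodes could be tried at leisure.

```python
# Hypothetical sketch of an attempt-limited wipe; not Apple's actual design.
MAX_ATTEMPTS = 10  # after ten consecutive failures, the data key is destroyed

class LockedDevice:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed_attempts = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("data erased: attempt limit exceeded")
        if guess == self._passcode:
            self._failed_attempts = 0  # correct guess resets the counter
            return True
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            self._wiped = True  # keys destroyed; brute-force guessing is futile
        return False

# With the wipe in place, guessing is hopeless; without it, a brute-force
# sweep of every four-digit code succeeds in seconds.
device = LockedDevice(passcode="7319")
for code in (f"{n:04d}" for n in range(10_000)):
    try:
        if device.try_unlock(code):
            print("unlocked with", code)
            break
    except RuntimeError:
        print("device wiped before the right code was found")
        break
```

Run as written, the loop triggers the wipe long before reaching the right code; delete the two lines that set `_wiped` and the sweep succeeds. That, in miniature, is the software the court has ordered Apple to write.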
The second issue is whether that precedent is justified. And that entails a judgment on whether security would be enhanced or weakened by Apple’s compliance. In the short term, the answer is that security will be enhanced. Farook was a terrorist; his phone is the only one being unlocked; and the device might give up the identity of other malefactors. But in the longer term, things are much fuzzier.
Security does not just mean protecting people from terrorism, but also warding off the threat of rogue espionage agencies, cybercriminals and enemy governments. If Apple writes a new piece of software that could circumvent its password systems on one phone, that software could fall into the hands of hackers and be modified to unlock other devices. If the capability to unlock iPhones exists, so will the temptation for the authorities to use it repeatedly. And if tech firms are forced to comply with this sort of request in America, it is harder for anyone to argue against similar demands from more repressive governments, such as China’s. This newspaper has long argued against cryptographic backdoors and skeleton keys on these grounds. It is possible to imagine a scenario that might override such concerns: if information is needed to avert a specific and imminent threat to many lives, for example. But in this instance, Apple’s case is the stronger.
Core arguments
This battle presages others. If the courts rule against Apple, it will work to make its devices so secure that they cannot be overridden by any updates. In that event (or, indeed, if the tech firm wins the Farook case), legislators will be tempted to mandate backdoor access via the statute book. If Tim Cook, Apple’s boss, is not to hasten the outcome he wishes to avoid, he must lay out the safeguards that would have persuaded the firm to accede to the FBI’s request. Tech firms are at the centre of a vital policy debate (see article). Apple has rejected the authorities’ solution. Now it must propose its own.