The FBI wants to get inside a terrorist’s iPhone. Apple is challenging this order. Who’s right? Turns out this is a much more difficult question than I originally thought.
When I read Apple CEO Tim Cook’s open letter to customers, my first reaction was “Go Apple! You’re absolutely right to resist this order, which is a clear and blatant overreach by the U.S. government.” I thought there was no way Apple should be forced to build a backdoor into the iPhone. It would be like opening Pandora’s box.
Backdoor is of course a strong word, and it’s no coincidence that Tim Cook chose to use it. One could argue that, technically, the FBI is not asking for a backdoor. What they’re asking for is a change to one particular iPhone’s operating system (iOS) so that it doesn’t erase all data on the phone after 10 incorrect passcode attempts. The FBI would then use a simple brute-force attack, trying every possible passcode combination until it gains access to the iPhone.
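To get a feel for why that single change matters so much, here is a rough back-of-the-envelope sketch. It assumes a 4-digit numeric passcode and roughly 80 ms of hardware-enforced work per attempt; both figures are my illustrative assumptions, not details from the court order.

```python
# Rough estimate of how quickly a short numeric passcode falls to brute force
# once the 10-attempt erase limit is out of the way.
# Assumptions (illustrative, not from the case): a 4-digit passcode and
# about 80 ms of hardware-enforced work per attempt.

PASSCODE_DIGITS = 4
SECONDS_PER_ATTEMPT = 0.08

combinations = 10 ** PASSCODE_DIGITS              # 10,000 possible codes
worst_case = combinations * SECONDS_PER_ATTEMPT   # time to try every single one
average_case = worst_case / 2                     # expected time to hit the right one

print(f"{combinations:,} possible passcodes")
print(f"Worst case:   {worst_case / 60:.1f} minutes")
print(f"Average case: {average_case / 60:.1f} minutes")
```

In other words, without the auto-erase safeguard a short numeric passcode offers only minutes of resistance, which is exactly why that safeguard matters.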
Furthermore, as The New York Times pointed out in its op-ed, the FBI is in fact approaching Apple through the front door. They’re going through the courts and explaining, clearly and publicly, why they want access to this particular iPhone. The motivation is a pretty good one: there might be information on this iPhone that could reveal the San Bernardino terrorists’ contacts. On the other hand, there might be nothing on it at all. In fact, the San Bernardino police chief thinks there’s a reasonably good chance that the phone contains nothing of value.
Apple fears this case would set a dangerous precedent. Other cases would soon emerge, and I presume it wouldn’t take long before Russia and China came forward with their own “requests”. Apple’s lawyer pointed out that FBI Director James Comey himself testified before Congress that there are 12 or 13 other iPhones the bureau would like to access. The Manhattan District Attorney said his office has 205 iPhones it would like to access. And that’s just one jurisdiction!
Apple also points out that it has a responsibility to millions of customers around the world. Customers who bought a product they believed was secure against hackers, cybercriminals, and authoritarian regimes. Customers who don’t want hackers or government spies to be able to intercept their messages, access their health records or financial data, track their location, or access the phone’s microphone or camera without their knowledge.
Is there any situation where you’d want Apple to comply with the FBI’s order? Certainly! Here’s an easy example: there’s an imminent terrorist nuclear threat to a major city, and we know an iPhone holds information about the threat. In this hypothetical scenario, who wouldn’t want government agencies to open up the phone?
But how serious should the criminal case or the terrorist threat be? Where do we draw the line? Who makes the decisions? Should they be made on a case-by-case basis? How high should the probability be for the phone to actually contain the information we suspect it contains?
To summarize, this dispute is much more difficult than I originally thought. The facts Edward Snowden leaked to the world don’t make things any easier either. The U.S. and U.K. governments (and in particular their signals intelligence agencies, the NSA and GCHQ) have shown us how dangerously easy it is for them to cross the line, and how conveniently our basic rules and rights are forgotten when we’re waging war on terrorism.
Nevertheless, somebody needs to decide the Apple vs. FBI case soon. It seems the task will fall to the U.S. Congress. I just hope they consider all aspects and consequences before they force Apple to comply with the FBI’s order, because believe me – that will be the outcome. At the very least, there should be strict controls and very high barriers for potential future cases.