On December 2, 2015, Syed Farook and his wife Tashfeen Malik carried out a deadly mass shooting at Farook’s workplace in San Bernardino, California. The attack, which was likely inspired by extreme anti-western views, has been labeled the “worst terrorist attack on American soil since Sept. 11.” Farook and Malik were killed the same day in a shootout with police, leaving investigators to gather and piece together evidence about the attack without ever having the opportunity to question the perpetrators.
The FBI now says it would like to access data on Syed Farook’s Apple iPhone, which it believes could be valuable for thwarting future terrorist attacks and for learning more about the motives and planning behind the couple’s deadly rampage. However, the FBI may also be trying to set a precedent that will ensure tech companies’ compliance in handing over data, or the tools to access data, in the future. On February 16, 2016, a federal judge in California sided with the FBI and issued an order to Apple that it “assist in enabling the search” of Farook’s phone.
This is not the first tussle between the FBI and Apple over how far the company must go in helping the FBI gather information from its customers. In 2014, Apple strengthened the privacy and security features of iPhones running its newer operating systems. Formerly, Apple had the capability to extract data from a person’s iPhone if it was handed over by law enforcement. With the newer phones, Apple said it would no longer be able to do this: setting a PIN now encrypts the data on the phone using two-tiered encryption, in which the PIN unlocks a key that in turn unlocks the data.
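The two-tiered scheme can be sketched in a few lines of Python. This is an illustrative model only, not Apple’s actual implementation: the function names and parameters below are invented for illustration, and real devices also entangle the PIN with a per-device hardware key that never leaves the chip, so the derivation cannot be run off-device.

```python
import hashlib
import os
import secrets

def derive_kek(pin: str, salt: bytes) -> bytes:
    """Derive a key-encryption key (KEK) from the PIN.
    (On a real iPhone this step is also tied to a hardware UID.)"""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def xor_wrap(a: bytes, b: bytes) -> bytes:
    # Stand-in for a real key-wrapping cipher (e.g. AES key wrap).
    return bytes(x ^ y for x, y in zip(a, b))

# Tier 1: the data itself is encrypted under a random data key.
data_key = secrets.token_bytes(32)

# Tier 2: the data key is wrapped (encrypted) under the PIN-derived KEK.
salt = os.urandom(16)
wrapped_key = xor_wrap(data_key, derive_kek("123456", salt))

# Unlocking: the correct PIN re-derives the KEK and unwraps the data key.
recovered = xor_wrap(wrapped_key, derive_kek("123456", salt))
assert recovered == data_key

# A wrong PIN yields garbage, not the data key.
assert xor_wrap(wrapped_key, derive_kek("000000", salt)) != data_key
```

One appeal of the two-tier design is that changing the PIN only requires re-wrapping the small data key, not re-encrypting everything on the device, and wiping the phone only requires destroying the wrapped key.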
Unfortunately for the FBI, it can’t simply keep guessing PINs until it gets one right. The iPhone that Farook used, the 5c, will automatically wipe its data after ten incorrect PIN entries. There is also a delay built into each PIN attempt, so that even a supercomputer would, by Apple’s own estimate, take more than 5.5 years to try every six-character passcode of lowercase letters and numbers; longer combinations of numbers and letters could take far longer. Furthermore, the phone is designed so that PINs must be entered directly on the device, so the FBI cannot simply connect the phone to a computer and automate its guesses.
These new security features are important to many customers in the wake of the 2013 Snowden revelations, when documents revealed that some tech companies were complicit in spying on the American people. Apple was in a position to show that it took privacy seriously, and did so by upgrading the protections on its devices. But what this means for the FBI is that there will be situations where a device holds evidence, the FBI has a court order, and it still cannot access the data.
Which is exactly the situation the FBI says it faces with Farook’s phone. Therefore, the FBI requested, and a judge ordered, that Apple do three things:
(1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.
Essentially, the FBI is asking Apple to write new software to defeat its own security measures. Unmoved by the government’s arguments, Apple has since publicly stated that it will not comply with the order.
What is extremely interesting, and perhaps most telling, is how each side has publicly addressed the situation. While it is true that there is an actual phone and an actual violent plot in the background of all this, the fight between the Feds and the tech giant is really about a much larger ongoing debate. This is a perfect case for studying the arguments for enhanced privacy against the arguments for increased law-enforcement access in the name of the public’s security.
From Apple’s perspective, CEO Tim Cook has said from the beginning that this is about privacy. But while that may appeal to Apple customers and privacy advocates, the general public may not agree that the FBI should be barred from looking through someone’s phone when it has a court order. Perhaps not surprisingly, Apple has pivoted its stance to one that raises concerns about security.
If Apple creates a backdoor to encryption, it says, it will only be a matter of time before the ability to hack devices falls into the wrong hands, such as criminals or foreign bad actors.
Once that happens, every device will be vulnerable. Therefore, Apple argues, breaking into this one phone will have consequences that far outweigh the reward.
For its part, the FBI has carefully tried to frame the issue as being about this one case as much as possible. The shooting is a sensitive subject, and there is great public support for fighting perceived terrorist threats. However, if the FBI wins this battle it will have set a precedent it would no doubt like to exploit in the future. As Mike Masnick of Techdirt reminds us, “after the Intelligence Community lost the fight to get a law banning strong encryption, intelligence officials said out loud that they’d just wait until the next terrorist attack.” The political environment surrounding a serious attack such as the San Bernardino shooting makes the security argument the FBI is pushing much more palatable. There is even doubt as to whether the FBI really needs the data it could access by breaking through the encryption on Farook’s phone. And the FBI’s own actions may be responsible for locking itself out in this case.
Whatever one thinks of the merits of each side’s arguments, this is a case that deserves close attention, because it involves important questions about the balance between personal privacy and collective security, and about whether a company should be compelled to help the government break into its own product. Interestingly, even if the government wins this round, privacy advocates are already one step ahead, coming up with ways to protect their data from a government hack. If nothing else piques one’s interest in this case, it is worth noting the irony that the FBI’s legal argument for why it has the authority to beat encryption rests on a law that is over 200 years old. Proving once again that in a fast-paced twenty-first century that sees a new iPhone seemingly every year, the pace of the law is as cautious and plodding as ever.