This story may be about Apple, and you may hate the very fabric the company was woven from, but it raises an issue important to anyone who values digital privacy and security in this day and age. Shortly after FBI director James Comey complained to Congress about the bureau’s inability to get into a San Bernardino terrorist’s phone, a court order was issued requiring Apple to aid in unlocking a smartphone submitted into evidence as part of that case.
Read: Google CEO’s response to Apple’s stance on FBI request
Apple’s initial response to that court filing was posted as an open letter from Tim Cook to customers on their website, and it was there they noted that the FBI had essentially asked them to create a “backdoor” for their phones.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
Apple refused the request, an act that spawned a sizable protest and show of support in front of a San Francisco Apple Store, as well as waves of conversation online.
But there’s a bit more to the story than that. For starters, the court doesn’t exactly ask for the type of “backdoor” that you may be thinking of — the company won’t create a mechanism for the FBI to gain access to your device anytime they want, for instance.
Instead, Apple was asked to create software that would disable iPhone security measures like wiping data after 10 incorrect passcode attempts. The software (which doesn’t currently exist) would have to be manually loaded onto a phone in the physical possession of whoever is using it. This would allow the FBI to attempt a brute-force attack (trying as many different combinations of letters or numbers as possible) on the phone without fear that it would completely wipe the data after just 10 tries.
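To see why that 10-attempt limit is the whole ballgame, consider a toy sketch (purely illustrative, not Apple’s actual mechanism): a 4-digit numeric passcode has only 10,000 possibilities, so without an attempt cap a brute-force search succeeds almost instantly, while a 10-try cap stops it cold.

```python
# Hypothetical illustration of brute-forcing a 4-digit PIN.
# This is NOT Apple's code; it just shows why a retry limit matters.

def brute_force(check, max_attempts=None):
    """Try every 4-digit PIN in order; give up if the attempt cap is hit."""
    for attempt, pin in enumerate(f"{n:04d}" for n in range(10_000)):
        if max_attempts is not None and attempt >= max_attempts:
            return None  # locked out (or data wiped) before success
        if check(pin):
            return pin
    return None

secret = "7291"
print(brute_force(lambda p: p == secret))                    # no cap: finds "7291"
print(brute_force(lambda p: p == secret, max_attempts=10))   # 10-try cap: None
```

With no cap, the loop walks the entire keyspace in a fraction of a second; with the cap, only PINs 0000–0009 are ever tried. Hardware delays between attempts (which the court order also asked Apple to remove) slow the uncapped case, but only the wipe-after-10 rule makes guessing truly infeasible.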
Although the FBI is asking for this solution only for use in this one case, granting it would set a precedent that no one wants. Apple’s biggest fear is that creating the software could lead to a dangerous future: someone could leak it into the wild, potentially giving anyone the ability to circumvent iPhone security as it exists today. It would also open the door for the FBI to make more requests like this in the future.
To be clear, Apple doesn’t have an issue working with law enforcement to fulfill lawful requests; it’s simply that the way the company builds its software now makes many of those requests impossible to fulfill. It’s when Apple is ordered to actively do business and build software in a way that exposes its customers’ information, privacy and security that it has an issue.
And that’s why Apple took the bold stance to outright oppose the order. Tim Cook isn’t alone in his public outcry to protect privacy, either. Here are comments from other notable names from the tech industry on this growing issue:
Important post by @tim_cook. Forcing companies to enable hacking could compromise users’ privacy. We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism. We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders. But that’s wholly different than requiring companies to enable hacking of customer devices and data. Could be a troubling precedent.
I have always admired Tim Cook for his stance on privacy and Apple’s efforts to protect user data and couldn’t agree more with everything said in their Customer Letter today. We must not allow this dangerous precedent to be set. Today our freedom and our liberty is at stake.
Not everyone is of the same opinion, though. On the other side of the fence stand representatives of the US government, including Republican presidential candidates who will likely use cyber security as an important platform to augment their campaigns. The consensus? Apple needs to fall in line and follow the order.
I agree 100% with the courts. In that case we should open [the iPhone] up. We have to use our heads, we have to use common sense. We have to be very careful. We have to remain very diligent. To think that Apple won’t allow us to get into [the terrorist’s] cell phone, who do they think they are?
Ultimately, I think being a good corporate citizen is important.
Court orders are not optional and Apple should comply. In this case, under a valid court order, Apple has been asked by the FBI to unlock a government owned cell phone to assist in the investigation of a terror attack that killed 14 Americans.
This is a very important issue that isn’t going to go away quietly. This will define a very important aspect of cyber security and privacy in a future where connected devices are only going to get more and more popular, diverse and accessible. Kudos to companies like Apple and Google who have enough courage and honor to stand up for the basic civil rights owed to every citizen.
At the same time, we do understand where the government is coming from. Privacy and security might pale in comparison to potentially saving human lives or preventing further horrific terrorist attacks that have major ramifications on a city, state or entire country.
But we want to hear from you: where do you stand? Should the government be allowed to issue orders that force companies like Apple to actively compromise user security and privacy? Or should companies be more willing to help governments do everything in their power to prosecute terrorists and criminals for a safer future? Let’s hear it below!