Author: Susan Joy Paul, JobberTechTalk Writer
In February, Apple filed a motion to vacate a court order requiring the company to unlock an iPhone used by one of the shooters in the 2015 San Bernardino terrorist attack. That’s a lot of information to swallow – too much for the average consumer, which is why most of us just aren’t getting it. A terrorist’s phone that might lead us to more terrorists? Bust it open, Apple.
So why are so many companies coming to Apple’s defense in formal statements, blogs, and by filing amicus briefs – legal documents with arguments they want the court to consider in the case? Amazon, AT&T, eBay, Facebook, Google, Kickstarter, LinkedIn, Microsoft, Pinterest, Reddit, Snapchat, and Twitter are among the Apple supporters listed at Amicus Briefs in Support of Apple. There is also support from the American Civil Liberties Union, the United Nations, Human Rights Watch, and even Salihin Kondoker of San Bernardino, whose wife Anies was shot, but survived the attack. What do they know that we, Joe Consumer, don’t?
First, a brief history: The December 2015 shooting left 14 people dead. San Bernardino County employee Syed Farook was one of the two shooters. After the attack, the FBI recovered an Apple iPhone 5C owned by the county and used by Farook. Agents tried to unlock the phone to recover its data, which could contain information about other terrorists or attacks, but a security feature erases the data after too many unsuccessful passcode attempts.
The FBI met with Apple in January to discuss recovering the data from the phone or from a backup. Unfortunately, shortly after acquiring the phone, the FBI had asked the county to reset Farook’s iCloud account password, which prevented the phone from making an automatic backup of its most recent data.
In February, the FBI asked Apple to produce a new version of its operating system with some of those security features disabled. The FBI wants to install this custom iOS as a “back door” into the phone, granting federal agents access. Apple refused. The U.S. Department of Justice then filed an application asking a federal judge to order Apple to create the software, install it on-site, and remove it when the FBI was done, and the judge issued the order. Apple responded with a motion to vacate, asking the court to withdraw it.
[Read Apple’s public explanation of their decision in A Message to Our Customers posted February 16, 2016—Ed.]
This month, the House Judiciary Committee held a hearing and questioned both sides. The topic has been discussed at length in security, technology, and business media. There’s a lot at stake, and companies know it.
Apple is known for protecting its customers’ privacy, and sacrificing that priority would do irreparable damage to its brand. Even if we accept the FBI’s demands as justified, opening this phone has the potential to compromise the security of every phone, and of all consumer technology. Judging by all the “secure” data on the Internet that was never supposed to be stolen, hacked, or leaked—new movies, Social Security numbers, credit card information, medical records, and tax returns—we have little reason to believe a backdoor in iOS would stay confidential forever.
If Apple complies with the FBI, more requests for access will follow, for other federal cases and from other agencies like state and local police. It’s only a matter of time before criminals and terrorist organizations get their hands on the technology and use it to compromise our personal privacy and our national security.
Finally, the court order sets a precedent of government dictating product development. The implications of government—rather than free enterprise—deciding what companies develop, or compelling changes to it, should be apparent. Just a few years ago, the secure webmail provider Lavabit suspended operations after the government ordered it to hand over the private SSL keys that would have exposed its customers’ email. Companies create products that consumers want; what gives the government the right to undermine that product development and consumers’ access to it?
Despite the facts, in the court of public opinion, the FBI may be winning. Presidential candidates use the case as a talking point to infuse public outrage with the threat of future attacks if Apple doesn’t comply. The national news stations air stories of cold cases, murders that could be solved if only the local police had access to a victim’s phone. Our knee-jerk reaction is to believe complying with the court order is the best course of action.
We have been trained to think like consumers. We care about instant gratification, and every television commercial, online advertisement and magazine slick targets that desire. In thirty seconds or less, we are made to smile and want that thing that makes us smile. Instant gratification, instant technology, instant security. Open that iPhone, Apple. Make us safe again.
Maybe it’s a good thing this is happening to Apple, a company with the means to fight it. This is also an opportunity to educate the public about why the outcome should matter to us not just as consumers, but as citizens. Somebody needs to save us from ourselves.
Security is a tough sell to the average phone user, especially when we give up our personal information so readily: we log in to sites with our favorite social or professional media accounts, let GPS track our location, and let apps use the information on our phones to “provide services” – that is, mine our personal data so they can market to us.
Apple might consider training us to be protectors of our own privacy with the same fervor they use to train us to be enthusiastic consumers. The learning curve will be long and steep, but the education is crucial if they expect us to stand behind them. The Patriot Act and the N.S.A. conditioned many of us to believe that being safe means compromising our privacy; they wore down our resistance, and because the repercussions aren’t immediately apparent and haven’t affected our day-to-day lives, we’ve already forgotten what we sacrificed.
Is educating the consumer Apple’s responsibility? Yes, if they want us to trust that their decisions are in our best interests. We need an easy-to-grasp explanation of why this matters to us, as consumers and as citizens. Make it real for us, Apple. Give us thirty seconds on why letting the government dictate your product development, force you to create a back door to our personal data, and compromise our privacy is a bad idea. Because right now, all we see are Apple lawyers in suits and ties, crying moms who want their daughters’ murderers found, and scary politicians and terrorists – and we consumers want a quick fix. We want to feel safe again. Cracking the iPhone is something I never want to see happen, but it may be easier than cracking that mindset.