Did FBI v. Apple – the most high-profile conflict to date over government access to private, encrypted data – change the world of digital surveillance and personal privacy?
With the first anniversary of the court fight over unlocking a terrorist’s iPhone approaching, the answer, from both the evidence and a range of stakeholders, is: not much – yet.
According to experts, it did not significantly increase law enforcement surveillance, but it certainly didn’t curb it either. Nor did it produce any major legal precedent, through either legislation or a Supreme Court ruling.
Some of those may be coming, however, possibly this year. Congressional committees have studied and reported on the issue. There is draft legislation focused on it in the works. There are ongoing conflicts – legal, legislative and philosophical – over whether forcing private firms to grant government access to data for criminal investigations or surveillance can be done without eroding personal privacy and civil rights.
Of course there is also the arrival of President Donald Trump, who has not promised any executive orders on the matter, but who did famously call for a boycott of Apple when the company refused to comply with the FBI’s demand.
And there is the march of technology – as a number of experts have pointed out, dozens of messaging apps now use strong encryption, which could make a backdoor to unlock a phone largely moot.
But at a minimum, the case intensified the debate over whether there is a way for competing privacy and government public safety interests to coexist.
The obvious reason FBI v. Apple didn’t set any precedents is that it was never legally resolved. Just before a scheduled hearing on the case – 43 days after the original court order – the FBI withdrew its complaint, saying it had hired a vendor that was able to break into the phone. The agency refused to name the vendor or describe the method used to hack it.
And in the view of some privacy advocates, that’s unfortunate, because they think the FBI would have lost. Nate Cardozo, senior staff attorney with the Electronic Frontier Foundation (EFF), who called the FBI complaint “wild overreach,” is one of them. He said the FBI knew its case was weak.
“The fact that the government pulled the plug on the litigation on the literal eve of the hearing speaks volumes about the strength of its legal argument. Instead of risking binding precedent on our side, the government blinked,” he said.
That is also the view of Greg Nojeim, director of the Freedom, Security and Technology Project at the Center for Democracy & Technology (CDT).
“The fact that the government was able to access the content it sought without Apple’s assistance undermined its argument that it needed to be able to compel Apple and other providers for the assistance they sought,” he said.
A bit of background
By now, the basics of the case are well known: On Dec. 1, 2015, Syed Rizwan Farook and his wife, Tashfeen Malik, killed 14 people and wounded 22 in a San Bernardino, Calif., shooting rampage. Both were then killed in a shootout with police.
About two months later, on Feb. 9, the FBI announced that it had been unable to unlock Farook’s employer-issued iPhone 5C, and demanded that Apple provide a way to do it.
A week after that, a federal magistrate upheld the demand, ordering Apple to disable the security feature that would wipe the data on the phone after more than 10 unsuccessful attempts to guess the passcode.
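The security feature at issue can be sketched in a few lines. The following is a deliberately simplified, hypothetical illustration of a retry-limited passcode check with an auto-wipe threshold – the general technique, not Apple’s actual implementation:

```python
# Hypothetical sketch of a retry-limited passcode check with auto-wipe.
# Illustrative only; this is not Apple's code or design.

MAX_ATTEMPTS = 10  # after this many consecutive failures, the data is erased


class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False  # data is gone; no guess can succeed now
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True  # erase data, defeating brute-force guessing
        return False


device = Device("1234")
for guess in ("0000", "1111", "2222"):
    device.try_unlock(guess)
print(device.wiped)  # three failures is still under the threshold
```

The point of the feature is that it caps brute-force guessing at ten tries; the court order would have compelled Apple to ship software that removed that cap.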
Apple appealed. CEO Tim Cook said the company didn’t have the capability to defeat that feature, and if it was forced to create a “backdoor” into the device, it would amount to "the software equivalent of cancer" that would endanger the privacy and security of hundreds of millions of iPhone users throughout the world.
More than 30 tech companies, including Google, Facebook and Twitter, along with a host of privacy and civil rights advocates, supported the appeal.
That, of course, didn’t make the issue go away. While no case since has generated that level of publicity, the conflicts continue. Last October, after a mass stabbing at a Minnesota mall linked to the terrorist group ISIS, the FBI said it was seeking to unlock the iPhone of the attacker, Dahir Adan. That potential conflict never made it to the courts.
The same is true of a standoff between the government and WhatsApp, the world’s largest mobile messaging service, which is owned by Facebook. Within the past 18 months, it began encrypting communications, which meant the Justice Department couldn’t eavesdrop on them, even with a judge’s wiretap order.
Cardozo said as far as he knows, that case is currently dormant.
Of course, unlocking or decrypting devices are not the only forms of government surveillance that continue to be contentious and that have not been settled. One is the use by police departments, for more than a decade, of the Stingray – a device that “impersonates” a cell tower and thereby monitors cell phone traffic in a given area.
The manufacturer, Harris Corp., has fought to keep information about it secret, arguing that disclosure would help criminals. But critics say it amounts to mass surveillance: police departments frequently deploy it without a warrant, gathering information from every phone within the device’s range.
There is also the change in Rule 41 of the Federal Rules of Criminal Procedure, which took effect this past Dec. 1 and allows any US judge – even a magistrate – to issue search warrants that give the FBI and law enforcement agencies the authority to hack multiple computers remotely in any jurisdiction, including outside the United States.
But, as Cardozo noted, while those forms of surveillance are concerning, they are different from the Apple case in that they don’t require a company to “subvert its security.”
And a high-intensity battle over that may be in the works.
In Congress, less than a month after the FBI withdrew its complaint, Senate Select Committee on Intelligence Chairman Richard Burr (R-N.C.) and Vice Chairman Dianne Feinstein (D-Calif.) issued a draft of what they labeled the "Compliance With Court Orders Act of 2016."
The draft, which called for a mandate that “all entities must comply with court orders to protect Americans from criminals and terrorists,” was never filed as actual legislation. But it would have required that “covered entities” – defined to include “device manufacturers, software manufacturers, electronic communication services, remote communication services, providers of wire or electronic communication services, providers of remote communication services, or any person who provides a product or method to facilitate a communication or to process or store data” – comply with court orders to turn over data in an intelligible format.
Feinstein said at the time in a press release that the proposal was not intended to undermine privacy, but simply protect the public. “We need strong encryption to protect personal data, but we also need to know when terrorists are plotting to kill Americans,” she said.
But it drew broad and loud condemnation from privacy advocates and technology experts, who said requiring a backdoor into devices would undermine security for all devices. Julian Sanchez, founding editor of Just Security and a senior fellow at the Cato Institute, called it "insanely misguided."
Backdoor for the good guys?
John Verdi, vice president of policy at the Future of Privacy Forum (FPF), was among hundreds of critics who said it couldn’t work. “Some have argued that firms like Apple can create backdoors that allow the good guys to access data, but prevent access by bad actors. Unfortunately, this isn't possible,” he said.
And John Bambenek, threat systems manager at Fidelis Cybersecurity, noted that most tech companies are global: if they are forced to provide a backdoor for US intelligence or law enforcement, it could then be used by “less-friendly jurisdictions that may have their own motivations.”
While there have been reports since last fall that a revised version of Burr-Feinstein may be filed this year, the logistics of it are not clear, since Feinstein has moved to the Judiciary Committee.
“Burr-Feinstein will be reintroduced,” said Paul Rosenzweig, founder of Red Branch Consulting and a former deputy assistant secretary for policy at the Department of Homeland Security. “But with Feinstein at Judiciary now, the exact structure will be different.”
A report from another congressional group has received a warmer reception.
The Encryption Working Group of the House Judiciary Committee and the House Energy and Commerce Committee issued its annual report in December, which included the following “observations”:
- Any measure that weakens encryption works against the national interest.
- Encryption technology is a global technology that is widely and increasingly available around the world.
- The variety of stakeholders, technologies, and other factors create different and divergent challenges with respect to encryption and the "going dark" phenomenon, and therefore there is no one-size-fits-all solution to the encryption challenge.
- Congress should foster cooperation between the law enforcement community and technology companies.
Bambenek called the report “bang on,” especially with regard to weakening encryption, which he agreed would “work against the national interest.”
Of course, any legislation that results will depend in large measure on how the various stakeholders define “cooperation.”
There is also a bill in the works by Rep. Michael McCaul (R-Texas), who chairs the House Homeland Security Committee, and Sen. Mark Warner (D-Va.), that would create a 16-member "Encryption Commission" to report on how conflicts might be resolved. It would include tech industry executives, privacy advocates, cryptologists, law enforcement officials and members of the intelligence community.
But EFF opposes it, arguing that the “questions” the commission would address have already been answered.
They haven’t been answered to the satisfaction of all parties, of course. As Nojeim put it, “companies and law enforcement are trying to adapt to new technology, and there is no road map for how that should best be done.”
Establishing that road map will inevitably be contentious. Bruce Schneier, CTO of Resilient Systems and an encryption expert who blogs on the topic frequently, wrote in a post last month that “there will be more government surveillance and more corporate surveillance. I expect legislative and judicial battles along several lines: a renewed call from the FBI for backdoors into encryption, more leeway for government hacking without a warrant, no controls on corporate surveillance, and more secret government demands for that corporate data.
“And if there's a major terrorist attack under Trump's watch, it'll be open season on our liberties. We may lose a lot of these battles, but we need to lose as few as possible and as little of our existing liberties as possible,” he wrote.