Why Apple is fighting the wrong encryption case

Photo caption: FBI Director James Comey speaks about the FBI's request to Apple to unlock the San Bernardino shooter's iPhone, during a hearing before the House Select Intelligence Committee in Washington, DC, 25 February 2016. EPA/JIM LO SCALZO

Apple’s chief executive Tim Cook is such a respected figure that it’s easy to overlook the basic problem with his argument about encryption: Cook is asserting that a private company and the interests of its customers should prevail over the public’s interest as expressed by our courts.
The San Bernardino encryption case was the wrong one to fight. Apple doubled down on Thursday by asking a federal court to vacate its order that the company create a tool to unlock the iPhone of shooter Syed Rizwan Farook. But if a higher court ultimately requires Apple to do that, as seems likely, the company will be seen by privacy advocates at home and abroad as having been rolled by the U.S. government. Foreign governments may demand similar treatment. Neither outcome is in Apple’s interest.
Cook’s stand has added to his luster as a tech hero. But the case, unfortunately, could be a lose-lose for U.S. technology companies, eroding both privacy protections and global market share.
Cook’s February 16 “Message to Our Customers” was somewhere between a civics lesson and a sales pitch. “Smartphones, led by iPhone, have become an essential part of our lives,” Cook began. He said he wanted to protect Apple users from what he claimed was an FBI “backdoor” that “would undermine the very freedoms and liberty our government is meant to protect.”
The Justice Department reacted indignantly. “Apple has attempted to design and market its products to allow technology, rather than the law, to control access to data,” the government argued in a February 19 motion. FBI Director James Comey also blasted Apple, arguing that “corporations that sell stuff for a living” should not be allowed to set the balance between privacy and safety.
Comey told Congress on Thursday: “This is the hardest question I’ve ever seen in government.”
Cook’s insistence that the FBI is seeking what could become a universal “backdoor” raises several interesting questions. First, Cook has argued that if Apple creates a special tool for Farook’s phone, it could be used by governments or hackers to crack any other iPhone. But Stewart Baker, a former NSA lawyer and a leading writer on encryption issues, cites an Apple security paper that suggests Apple has plenty of ways to prevent the tool from being used without Apple’s permission or on phones other than Farook’s.
“Apple’s new security architecture requires that any Apple update, including one written for law enforcement, must be uniquely tied to each individual phone that gets it,” Baker said in an email. “The phone can’t download an update unless it’s been digitally signed by Apple and then ‘locked’ to an individual phone.”
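For readers who want to picture Baker’s point, here is a minimal sketch of how a software update can be cryptographically bound to a single phone. It is an illustration only, not Apple’s actual mechanism: the signing key and device IDs are invented placeholders, and a symmetric HMAC stands in for the asymmetric code-signing Apple actually uses.

```python
# Illustrative sketch only -- models the idea that an update is accepted
# solely if its signature is tied to one specific device identifier.
import hashlib
import hmac

SIGNING_KEY = b"stand-in for the vendor's signing key"  # hypothetical placeholder


def sign_update(update_payload: bytes, device_id: str) -> bytes:
    """'Personalize' an update: the signature covers both the payload and
    the unique ID of the one phone allowed to install it."""
    return hmac.new(SIGNING_KEY, update_payload + device_id.encode(),
                    hashlib.sha256).digest()


def device_accepts_update(update_payload: bytes, signature: bytes,
                          this_device_id: str) -> bool:
    """The phone recomputes the expected signature using its own ID; an
    update signed for any other phone fails the check and is refused."""
    expected = hmac.new(SIGNING_KEY, update_payload + this_device_id.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)


# Usage: an update signed for one phone will not install on another.
payload = b"special unlock firmware"
sig = sign_update(payload, device_id="PHONE-A")
print(device_accepts_update(payload, sig, this_device_id="PHONE-A"))  # True
print(device_accepts_update(payload, sig, this_device_id="PHONE-B"))  # False
```

On this view, a tool built for Farook’s phone would be useless on any other device unless Apple itself chose to sign it for that device.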
Apple has also argued that if it unlocks Farook’s phone for the FBI, it might have to provide similar help whenever it gets a legal order from foreign governments, including repressive ones in Russia or China. But it’s not clear why this would be so.
The access tool presumably will be kept in the United States. Any foreign government that wants to use it can make a request through what’s known as a Mutual Legal Assistance Treaty, which allows a U.S. judge to review the case and make sure the use fits proper legal standards. Critics complain that the process is underfunded, slow and cumbersome. So spend more, speed it up — and keep proper judicial control.
An intriguing, little-noted aspect of Apple’s argument is that the U.S. government shouldn’t make such a fuss about “going dark” with encrypted iPhones because it has so many other useful surveillance techniques. Here, I suspect Apple’s supporters are right. U.S. intelligence agencies have indeed devised new ways to analyze “big data” and find patterns that defeat the clever use of smartphone and email encryption by terrorist groups such as the Islamic State.
The best rebuttal of Comey’s “going dark” argument came in a report issued this month by Harvard’s Berkman Center for Internet and Society, titled “Don’t Panic.” It noted that in a world of smart appliances and clouds of unencrypted data, people leave so much digital exhaust that it’s still possible for intelligence agencies to find and track dangerous targets.
What’s the “net assessment” of the costs and benefits for U.S. national security in this debate? It seems clear that U.S. interests are served by a world where there is pervasive use of iPhone-style encrypted smartphones that embody American values of privacy and free exchange of information.
The Apple legal brawl could undermine this advantage. Comey’s broadsides against “going dark” may make consumers suspicious about U.S. technology. So, unfortunately, do Cook’s false claims about a “backdoor.”
— Washington Post Writers Group

David Ignatius, best-selling author and prize-winning columnist for the Washington Post, has been covering the CIA for more than twenty-five years.
