Why Apple’s fight with FBI is so hard to referee

Protesters demonstrate outside Federal Bureau of Investigation (FBI) headquarters in Washington, DC, on February 23, 2016, objecting to the US government's attempt to put a back door into the Apple iPhone. (Photo: AFP / Paul J. Richards)

The fight between Apple and the U.S. government comes down to a technical enigma wrapped in layers of emotional debate.
On the surface, people seem to be drawn to opposing sides by feelings: fear of terrorism, suspicion of government, distrust of corporations.
But the crux of the disagreement comes down to a technical question, not a gut feeling: whether it’s possible for Apple to disable its own security system to break into a deceased terrorist’s iPhone without jeopardizing the security of all iPhones.
Since Apple's software is proprietary, the answer to that technical question remains shrouded in uncertainty. Still, decisions need to be made, and good policy can be formulated in uncertain situations, just as health authorities had to respond to the Zika outbreak though little was known about its potential health consequences.
In the Apple case, decisions will affect the way we balance the fight against terrorism with concerns over the erosion of privacy in a world increasingly dependent on smart phones to track and guide people’s lives.
The phone in question belonged to Syed Rizwan Farook, who, along with his wife, shot and killed 14 people in San Bernardino last December. Unlocking it may not sound like a hard problem for the technical wizards at Apple, but computer scientists and cryptography experts say it may indeed be impossible, which is what the company claimed in a letter to customers issued Feb. 16. Apple set up the security system so even its creators can't break into a customer's phone without creating new software to make it possible.
The hard part is bypassing a feature that deletes sensitive data after 10 incorrect password attempts. That's important because it's surprisingly easy to guess a six-digit password.
If not for such a limit, the FBI, or hackers for that matter, could use what’s been called a brute force approach to try every combination of six numerals until they hit on the right one. There are a total of 1,000,000 such combinations, and on average it should take about half a million guesses before the system is cracked.
That’s a lot of tapping for a person who steals a phone, but not for a fast computer, said Cornell University computer scientist Steven Wicker. Most computer science grad students could create a system to enter perhaps 1,000 possible passwords per second, he said. That would hit on the right one in 15 minutes or less, depending on luck.
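For a rough sense of the arithmetic, here is a back-of-the-envelope sketch in Python. The 1,000-guesses-per-second rate is the figure Wicker cites, not a measured value; the rest is just counting.

```python
# Back-of-the-envelope brute-force timing for a six-digit passcode.
GUESSES_PER_SECOND = 1_000            # Wicker's assumed guessing rate
total_combinations = 10 ** 6          # six digits: 000000 through 999999
average_guesses = total_combinations / 2

worst_case_minutes = total_combinations / GUESSES_PER_SECOND / 60
average_minutes = average_guesses / GUESSES_PER_SECOND / 60

print(f"Worst case: {worst_case_minutes:.1f} minutes")  # about 17 minutes
print(f"Average:    {average_minutes:.1f} minutes")     # about 8 minutes
```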
Code makers stay ahead of code breakers by making strings of characters so long that it would take an eternity to try even a fraction of the possibilities.
The fact that codes can be made to withstand brute force attacks, at least for the next few million years, allows people to send credit card information over the Internet with reasonable safety, said Indrajit Ray, a computer scientist from Colorado State University.
In a simple example, he said, imagine the formula A + B = C. A is your secret number. B is a number known only to the colleague you want to receive your secret message. Your colleague sees C and subtracts B to recover A, but a hacker who intercepts only C would have to guess every possible value of B to get at your secret. If C is 1,000, that might not be so hard, but if it's in the billions or trillions, the guessing becomes unwieldy.
In the real world, encryption systems are more complicated but the idea is the same – some operation shuffles your secret numbers and only those holding a key can unshuffle them. To keep the key safe from brute force, it has to be big.
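As a toy illustration of Ray's analogy, a few lines of Python can play out the A + B = C example. The specific numbers are invented for the sketch; real encryption uses far more elaborate math, but the shape of the problem is the same.

```python
# A toy version of the A + B = C analogy (not a real cipher).
secret_A = 4217                  # the message, known only to the sender
shared_key_B = 19_883_502_661    # the key, known only to sender and recipient

ciphertext_C = secret_A + shared_key_B   # what an eavesdropper might intercept

# The intended recipient, who knows B, simply subtracts to recover A.
recovered_A = ciphertext_C - shared_key_B
assert recovered_A == secret_A

# An eavesdropper who sees only C must guess B. If B can run into the
# trillions, trying every candidate is hopeless in practice:
#
# for guess_B in range(ciphertext_C + 1):      # astronomically many iterations
#     candidate_A = ciphertext_C - guess_B
```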
Apple can’t rely on large numbers to keep passcodes safe, because that would mean asking people to type more than 40 digits. “It’s a human factor,” said Cornell’s Wicker. People would refuse to do it or would choose strings of a single repeated digit, defeating the purpose.
So instead, iPhones include a security option to erase data after 10 incorrect attempts to guess a password. Disabling it would mean writing new operating system software. This is what the experts are calling a back door.
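To make the mechanism concrete, here is a minimal sketch of the kind of retry limit at issue. It is an illustration only, not Apple's implementation; the class name and details are invented for the example.

```python
# Minimal sketch of an erase-after-10-failures lockout (illustration only).
MAX_ATTEMPTS = 10

class LockedDevice:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed_attempts = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            return False                      # data already erased; guessing is pointless
        if guess == self._passcode:
            self._failed_attempts = 0
            return True
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            self._wiped = True                # the feature the FBI wants disabled
        return False
```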
“A back door is an intentional weakness in the system that allows a group that’s in the know to get in,” said Wicker. Apple doesn’t already have a back door into iPhones, and so, the company claims, one would have to be created. “They’re being asked to create a tool that could potentially break into any iPhone,” Wicker said.
Ray, the computer scientist from Colorado State, compared it to a situation in which there’s a secure storage facility with no master key. Imagine that the FBI wanted to get into one of the lockers, he said, but only the owner knows how the locks work. Can the owner create a crowbar that can only open one locker but is useless for all the others?
It’s hard for anyone outside Apple to know whether it’s possible to make such a specialized single-locker crowbar, said Ray, since the company’s software code is secret. And yet people on both sides of the debate tend to be sure that it either is or isn’t possible.
Good decision making in cases like this rests on understanding the boundaries of current knowledge and resisting the temptation to fill in the gaps with assumptions.
There’s a lot at stake, with the explosion of smart phones now tracking our whereabouts, our interests, health data, contacts and appointments. “It’s a permanent record of our daily lives,” said Wicker, and it’s potentially available to both commercial interests and the federal government. “A lot has gone on without public debate,” he said.
The experts laud Apple for prompting that long-needed public discussion. Court documents unsealed this week revealed that the FBI has made similar requests of Apple in nine other cases.
This is a good test case to start considering where we want to draw the line between the need to help crime fighters and the risk of a future in which we can’t keep much of anything secret.
—Bloomberg


Faye Flam is an American journalist. She has written for Science magazine and wrote two weekly columns for The Philadelphia Inquirer, one of them on evolution.
