The FBI may have unlocked the iPhone 5C held by a San Bernardino shooter without Apple’s help, but the agency and the world’s largest tech company are still at odds over whether law enforcement should be granted access to the smartphones of suspects and criminals.
On Tuesday, a U.S. House of Representatives subcommittee will hear arguments from Apple and the FBI on how best to weigh the privacy and security of citizens in such cases. Amid the dispute, the CEO of a mobile forensics company has proposed a controversial “backdoor” solution based on public key cryptography that he says represents the best possible compromise between the two.
However, several cybersecurity and computer science experts interviewed by IEEE Spectrum disagree, saying that this type of access creates vulnerabilities and is of limited value to law enforcement.
Though many experts have derided the FBI’s attempts to force Apple to hack an iPhone, few have offered concrete proposals for methods to bridge the disagreement between technology companies and government officials. Many see restricted access as an inevitable consequence of better security protocols.
Joel Bollö, CEO of mobile forensics company MSAB, is one of the few offering a solution. Bollö is trying to drum up support for his idea, which he outlined in an opinion piece in The Detroit News in early April. His approach is based on public key cryptography, an encryption technique widely used to secure web traffic and financial transactions.
Public key cryptography enables two or more parties to share information using “keys,” which consist of very large numbers. Anyone can encrypt data using a recipient’s publicly available key, but only the recipient can decrypt those messages, using a second key that they have kept secret. “The idea is that you split the keys so it's like a handshake to open these phones,” Bollö says.
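The split between a public encryption key and a private decryption key can be seen in a textbook RSA example. The sketch below uses deliberately tiny numbers so the arithmetic is visible; real keys are hundreds of digits long and wrapped in padding schemes, so this is an illustration of the principle only, not anything resembling a production system.

```python
# Toy RSA illustration of public key cryptography (not production crypto).
# Standard textbook parameters: two small primes p and q.
p, q = 61, 53
n = p * q                    # modulus, part of both keys: 3233
phi = (p - 1) * (q - 1)      # 3120, used to derive the private key
e = 17                       # public exponent (must be coprime with phi)
d = pow(e, -1, phi)          # private exponent: modular inverse of e

message = 65                        # a message encoded as a number < n
ciphertext = pow(message, e, n)     # anyone can encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)   # only the private key (d, n) decrypts it

print(message, ciphertext, recovered)
```

Here the public key `(e, n)` can be handed out freely, because recovering `d` from it requires factoring `n`, which is infeasible at realistic key sizes.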
In Bollö’s proposal, tech manufacturers and law enforcement would share a set of many public and private keys. Whenever Apple manufactured a new iPhone, the company would encrypt its contents using a public key unique to that phone, and give the FBI the matching private key needed to decrypt the information stored on that particular device.
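The workflow that the article describes amounts to per-device key escrow. The sketch below models it with toy RSA numbers; the function names, the serial-number index, and the in-memory `escrow` dictionary are all illustrative assumptions, not details of MSAB's actual design.

```python
# Hypothetical sketch of per-device key escrow, as described in the proposal.
# All names and data structures here are assumptions for illustration.

def make_toy_keypair(p, q, e=17):
    """Build a textbook RSA key pair from two small primes (toy numbers only)."""
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))  # private exponent
    return (e, n), (d, n)              # (public key, private key)

escrow = {}  # law enforcement's store of private keys, indexed by device

def provision_phone(serial, p, q):
    """Manufacturer step: generate this phone's keys, escrow the private half."""
    public, private = make_toy_keypair(p, q)
    escrow[serial] = private           # handed to law enforcement at manufacture
    return public                      # used to encrypt the device's contents

def decrypt_with_warrant(serial, ciphertext):
    """Law enforcement step: works only for a phone whose key was escrowed."""
    d, n = escrow[serial]
    return pow(ciphertext, d, n)

e_, n_ = provision_phone("IMEI-0001", 61, 53)  # keys unique to this phone
locked = pow(42, e_, n_)                       # device encrypts a value
print(decrypt_with_warrant("IMEI-0001", locked))  # -> 42
```

Because each phone gets its own key pair, a leaked private key would expose only one device rather than every phone ever made, which is the property Bollö leans on when distinguishing his plan from a single master key.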
Previous “backdoor” proposals have come under fire for opening electronic devices to mass government surveillance and creating an easy target for hackers. Bollö says his tactic, which he calls Forensic Access Control Technology or FACT, is different because it requires officials to physically have a phone in their possession. He adds that this constraint also limits its appeal to hackers, since they could only attack phones they had stolen.
Even so, Rick Mislan, a mobile device forensics expert at Rochester Institute of Technology, is wary of the U.S. government’s ability to keep any such key secure. Last year, hackers retrieved data identifying 21.5 million people from the Office of Personnel Management.
“There needs to be some type of access for a responsible person for that device,” says Mislan. “But the question is, how do we trust the agency or that authority to use that with care?”
As an extra precaution, Bollö thinks the FBI should store all private keys on a smartcard accessible only through a six-digit passcode. He also prefers to generate many keys rather than a single master key. House and car keys, for example, are often cut in identical shapes, yet because duplicate copies are so widely dispersed, the practice still keeps those assets reasonably secure.
Still, Bollö admits his strategy is less secure than not creating any keys in the first place.
Robert Cunningham, chair of the IEEE Cybersecurity Initiative, says Bollö’s proposal reminds him a lot of the Clipper Chip fiasco of the 1990s. At the time, the U.S. National Security Agency tried to persuade telecommunications companies to build a chip, known as the Clipper Chip, into their devices. The chip encrypted calls while escrowing the keys, so that law enforcement could later decrypt intercepted conversations using public key cryptography.
Similar to the protections built into Bollö’s proposal, Clipper Chip keys were unique to each phone and securely stored on floppy disks in a safe. But the chip lasted only a few years on the market, in part because it had vulnerabilities that made it susceptible to hackers.
Now, Cunningham is hesitant to say that any solution to the Apple and FBI case will stick until society agrees on a clearer definition of security in the face of privacy concerns. “I can't figure out what the right definition of security would be myself,” he says.
Dorothy Denning, a cybersecurity expert at the U.S. Naval Postgraduate School, was a longtime supporter of the Clipper Chip. But even she says that times have changed, and the public debate around security and privacy is much more raucous today.
“You see a lot more vocal opposition to the idea of mandating government access now than you did before,” she says. “We have a lot more people online now who are more familiar with the issues. I think there's a lot more concern about having security and not putting backdoors in intentionally and making things potentially weaker.”
Dylan Ayrey, a security engineer with the information security company Praetorian, points out that an iPhone’s lock screen is only the first barrier to its contents: even if investigators get past it, individual apps increasingly encrypt their own data. WhatsApp, for example, recently announced that it would use end-to-end encryption for all its messaging services.
“Encryption is here whether we want it to be or not,” he says. “The landscape has changed forever, mostly for the better, and these types of proposals can't gain traction.”
However, just last week, two U.S. senators introduced legislation to require tech companies to unlock phones and provide other “technical assistance” to government officials with a court order.
Mislan at Rochester Institute of Technology points out that as CEO of one of the world’s leading mobile forensics companies, Bollö would be uniquely positioned to profit from any software designed to execute FACT. Bollö insists that his competitors could do the same and says he is focused on developing a new industry standard rather than a software sales pitch.
"Of course, we could develop this stuff and we'd be happy to do that but I'm not trying to push that,” Bollö says. “I'm trying to say, ‘Here's a solution that could work for everyone.’"
Rather than require companies to install a key, Praetorian’s Ayrey thinks the best solution is for mobile forensics companies such as MSAB and Praetorian to continue doing what they have been doing all along—finding vulnerabilities in each new device or operating system that is released, and exploiting those holes on behalf of clients until an update renders them obsolete.
For consumers, Mislan has an even simpler strategy. “For me, it boils down to: If you really want to protect something, don't put it on your phone,” he says.