Apple Has Already Won. Now It Should Crack the San Bernardino iPhone

Illustration: Getty Images

As the Apple and FBI standoff continues, I asked Rick Mislan, a mobile device security expert who has worked with law enforcement agencies around the country and written for IEEE Spectrum on the topic (see “Cell Phone Crime Solvers” and “When the Evidence is On the Cell Phone”), and his colleagues Alex Hew and Damian Kumor to consider the situation and the issues involved. The following is their analysis.

Unless you’ve been completely off the grid for a week, you already know the FBI has obtained a court order requiring Apple to create a special operating system that lacks certain security features and to load it on the iPhone 5c found in the possession of one of the San Bernardino terrorists—all for the purpose of gathering evidence. Apple has refused to comply, on the grounds that doing so could compromise the privacy of all of its users, who entrust the company to protect their data.

Everybody seems to have an opinion about whether the FBI’s request is reasonable, or if Apple is doing the right thing by refusing.  And behind it all is a debate that has been going on for years: whether or not companies that encrypt data, including not only Apple, but Google, Facebook, and many others, should build in “backdoors” for law enforcement access. Such backdoors would allow law enforcement, with a search warrant, to bypass the security on mobile devices in order to gather evidence.

That is a bigger issue—one for Congress and the courts to address. Putting that question aside, there is another way to look at the Apple/FBI controversy: from the perspective of the world of mobile device forensics, a world in which we have been working for at least the past 10 years. And from that perspective, this case differs dramatically from business as usual.

Since early 2000, it has been third-party forensic experts and companies—not device manufacturers—who have provided the tools and techniques that law enforcement agencies use to access data stored in mobile devices. Among them are companies such as Access Data, Cellebrite, Compelson, ElcomSoft, Final Data, Katana, Logicube, MicroSystemation, Oxygen, Paraben, Radio Tactics, and Susteen. The tools developed by these companies are tested by the National Institute of Justice and approved to provide accurate evidence, admissible in court.

Before these tools were developed, law enforcement authorities did not use purpose-built evidence-gathering technology. Instead they relied on off-the-shelf data synchronization tools—either built by the manufacturer or some other third party—which synced mobile devices with computers. In their day, the most common such tools were the Desktop Manager from BlackBerry, Microsoft’s own sync tools for its PocketPC and CE devices, and DataPilot from Susteen (advertised, it seemed, on the back of every SkyMall magazine during the late 1990s). These could all create a backup of the data accessible from the device. However, these tools lacked the ability to bypass a device’s built-in security, as well as the forensic hashing techniques that ensure data integrity. So the user had to know the password or PIN needed to access the phone’s data.

In contrast, the commercial forensic tools get around the password, generally by exploiting existing vulnerabilities in the phone’s software or by introducing a forensic bootloader, a unique piece of software that loads a modified version of the operating system to allow extraction of the device’s file system.

Of course, keeping up with newly released devices and software is a big challenge for the makers of mobile device forensic tools. Cellebrite has agreements with many international carriers, so it can periodically update its forensic systems to connect to and extract data from new devices before they hit the market. Cellebrite tools work on 32-bit Apple devices up to iOS 8.4.1. Because Apple does not directly cooperate with Cellebrite, it takes the company some time to update its system to perform forensic analysis on new Apple devices.

Even though the tools can’t quite stay abreast of new operating system releases, the U.S. government has historically been able to get into iPhones—and Androids, for that matter—because Apple and Google have helped them. In fact, Apple has reportedly unlocked iPhones for the U.S. government at least 70 times. And Apple has made users (at least, the few who read the fine print) aware of that possibility.

Recently, though, the ground shifted a bit: Apple published in its Legal Process Guidelines that it can provide the service of “Extracting Data from Passcode Locked iOS Devices.” But it maintained the caveat that:

For all devices running iOS 8.0 and later versions, Apple will not perform iOS data extractions as data extraction tools are no longer effective. The files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess.

So, before iOS 8 (which was introduced in June of 2014), with the right legal paperwork in hand, law enforcement would ask Apple to provide access to the phone; Apple would then unlock the phone, retrieve the data, copy it onto a law-enforcement data drive, and return the device. Or, in the case of iOS 8, law enforcement could turn to third-party tools to retrieve the data without Apple’s help.

Meanwhile, Apple kept working on patching the flaws in iOS that allowed unauthorized access by hackers, and also by law enforcement. And it went even further to tighten security in iOS 9, the system software on the San Bernardino phone.

With iOS 9, the iPhone automatically encrypts itself with a hidden key. This key combines a value unique to the phone with the user’s password or PIN. That key is used with AES-256, an open and very strong encryption standard with no known practical weaknesses.

As a result, someone getting into the phone without the PIN, as third-party tools do, would see only gibberish, because even though the phone’s data would be visible, it would still be encrypted. As a further protection, if too many incorrect attempts are made to guess the password or PIN, the phone erases its key, preventing anyone from ever decoding the encrypted data.
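The scheme described above can be sketched roughly in code. To be clear, this is a simplified illustration, not Apple’s actual implementation: the device identifier, the PBKDF2 derivation, the iteration count, and the ten-try wipe threshold below are all illustrative stand-ins for the hardware-bound mechanisms iOS actually uses.

```python
import hashlib

# Hypothetical stand-in for the phone's unique hardware value (not a real UID).
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    """Entangle the user's passcode with the device-unique value.
    iOS does this in hardware; PBKDF2 here is only illustrative."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

class ToyEnclave:
    """Toy model of the erase-after-too-many-tries behavior."""
    MAX_TRIES = 10

    def __init__(self, passcode: str):
        self._key = derive_key(passcode, DEVICE_UID)
        self._tries = 0

    def try_unlock(self, guess: str):
        if self._key is None:
            raise RuntimeError("key erased; data is unrecoverable")
        if derive_key(guess, DEVICE_UID) == self._key:
            self._tries = 0
            return self._key  # caller could now decrypt the file system
        self._tries += 1
        if self._tries >= self.MAX_TRIES:
            self._key = None  # wipe: the ciphertext is gibberish forever
        return None
```

The point of the model: a wrong guess yields nothing, and after ten wrong guesses the key is gone, so even the correct passcode can no longer recover the data.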

It is unclear whether these third parties will succeed in designing a tool that can access and decrypt the data without the PIN. Since there is no intentional workaround built into the operating system, they would need to find an unintentional vulnerability in the phone. While nothing is perfectly secure, Apple has already patched some vulnerabilities that were discovered, and if such a tool were successfully created, it would be only a matter of time until that security hole was patched as well. Essentially, third-party toolmakers can kick the can down the road, but eventually only Apple will be able to access these devices.

Without Apple storing passwords, and without a third-party forensics tool in the hands of law enforcement, the U.S. government has requested that Apple create a “signed iPhone Software file, recovery bundle, or other Software Image File (SIF).” This software would bypass or disable the auto-erase function and let the FBI “submit passcodes” without the built-in delays that are meant to make such a brute-force approach impractical. The plan would be to install this new file or bundle, essentially a patched operating system stripped of its security safeguards, on just this one phone, allowing the FBI to retrieve any evidence that may be present. Recently, the Department of Justice has indicated that Apple can maintain control of this software after it has been used.

So what’s the big deal? Tim Cook, in his message to Apple’s customers, said the main issue is that the order would require Apple to create a new software file or bundle for one specific phone. The new firmware would be digitally signed by Apple and tied to the serial number of that phone.

Theoretically, this link to the serial number should make the firmware impossible to use on another phone. Any iPhone will reject firmware that is not digitally signed by Apple, and in this case the serial number would be incorporated into the signing process. Only Apple holds the keys needed to sign the firmware, so even if the firmware leaked, it could not be used on any other phone.
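The principle of binding a signature to one serial number can be sketched as follows. Apple actually uses asymmetric signatures, where only Apple holds the signing key; the HMAC below is a symmetric stand-in so the example stays self-contained, and the key and serial numbers are invented.

```python
import hashlib
import hmac

# Stand-in for Apple's private signing key (illustrative only).
APPLE_SIGNING_KEY = b"held-only-by-apple"

def sign_firmware(firmware: bytes, serial: str) -> bytes:
    # The target device's serial number is folded into the signed
    # message, so the signature is valid for that one device only.
    msg = serial.encode() + hashlib.sha256(firmware).digest()
    return hmac.new(APPLE_SIGNING_KEY, msg, "sha256").digest()

def device_accepts(firmware: bytes, signature: bytes, own_serial: str) -> bool:
    # Each phone checks the signature against its own serial number;
    # a signature minted for a different serial fails verification.
    expected = sign_firmware(firmware, own_serial)
    return hmac.compare_digest(signature, expected)
```

A signature produced for one serial number verifies only on that device; the same firmware and signature presented to a phone with a different serial number is rejected.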

However, it’s not quite that simple. The real value lies in the methodology used to get into the phone, not in the firmware itself. Once Apple develops such a tool, the “how” of getting around the PIN lock screen will be out of the bag. It could certainly be used on “jailbroken” phones, which don’t require new software to be signed, and jailbroken phones are common both in the United States and internationally. For standard Apple phones, security going forward would rest on the hope that the new code could never be loaded onto any other phone.

In a way, the FBI’s request is less like asking a locksmith to lend a master key for a single use and then destroy it than like asking a vault maker to build the tool that breaks into its own vaults. Once this tool is created, the big question will be who else has access to it. Even if Apple controls the tool, will its use be limited to the U.S. government, and only in terrorism cases? Or, more likely, will it have to be made available to the governments of every country in which Apple operates, for use in any way those governments wish? One can imagine uses never originally planned for, such as collecting data from political opponents or corporate rivals.

On the other hand, if the tool does not exist, then Apple is able to avoid entering this thicket of moral and legal issues.

There are other issues at stake, of course. Some argue that the controversy is not based in anything technical but is simply Apple making a stand in light of the privacy concerns many Americans are feeling after the NSA documents leaked by whistleblower Edward Snowden. And Apple is not alone: other major industry players, including rival Google, have voiced their support for Apple’s stance.

It is worth considering whether the lack of access to evidence on smartphones is hampering investigations. That lack of access is something new: since the technology’s inception, investigators have been able to pull evidence off smartphones for use in investigations. Unless users deleted the data on the phone or used a separate program to encrypt their private data, there has always, until now, been a way to access a phone’s data.

The notoriety of this case itself, perhaps more than how it is resolved, may affect future investigations. It remains to be seen whether these visible public discussions push users, criminals, or terrorists to secure their phones with a separate program that encrypts their private data, one likely to come from a developer far outside the FBI’s reach.

Apple is also making a business stand. More than half of smartphone users in the United States own an iPhone and paid a premium for it, and Apple wants to keep differentiating itself from competitors in order to maintain that premium pricing. One of its strengths is the trust users place in the security of its products. For years, this kind of trust was what kept BlackBerry so popular in the government sector: President Obama was famously attached to his BlackBerry, and the NSA even permitted the use of a modified version, trusting BlackBerry’s security. Such trust kept BlackBerry in a strong market position until about 2014, when Android, Apple, and Windows phones finally surpassed it.

Another interesting question is whether the FBI really believes there is useful evidence on the phone, or whether it is simply using a notorious case to open a door for future use. Once a precedent has been set in one case, it should be easier to justify in the next. Given that the San Bernardino shooting occurred on 2 December 2015, and the phone was recovered from the vehicle, why the concern now, months later, and not then?

As the device was seized on 2 December, the standard practice would have been to preserve the iPhone by keeping it off the network but on a charger. Unfortunately, we don’t know if that was the case. If it was, then it is quite possible that Siri, the iPhone’s voice-activated artificial intelligence, could have provided some useful information.

Jonathan Zdziarski, a well-known expert and author in iPhone forensics, pointed out that the Apple service “Find My iPhone” was still active on the device and wondered why someone would use a phone that was possibly tracking him. And much important data, such as calls made, SMS sender and recipient numbers, and towers connected, could be obtained from the cell phone carrier, not the phone itself.

Zdziarski also noted that the iPhone in question was making iCloud backups until a month and a half before the shooting spree, and that those backups were empty. It is quite possible that, knowing of the upcoming events, the owner didn’t want anything pushed up to the cloud. It was also reported that the password for the iCloud account was reset by an IT employee of the San Bernardino County Department of Public Health while the phone was in the FBI’s possession, cutting off a possible path Apple suggested for getting the data without unlocking the phone. It is unclear whether he was doing this of his own volition or at the FBI’s request.

As mobile device forensics experts, we believe that, since Apple can create the software with confidence that it can never be used on another iPhone, it should reluctantly comply.

In some ways Apple has already won. Apple’s public stance has drawn the eyes of the world to a battle between an international technology company that is fighting for your privacy and a U.S. government that, unable to unlock the phone, has turned to bullying a company into doing it instead. The world is waiting to see who will flinch first, Apple or the FBI.

And the FBI has already lost: as we pointed out, any criminal looking for data privacy has now been alerted that security software beyond what a device manufacturer provides should be employed.

Of course, this situation is much greater than the matter of unlocking a single iPhone. This decision will set a precedent for all future matters related to encryption as it pertains to U.S. law enforcement accessing the devices and data of private citizens. So far, Apple has already obtained some concessions from the U.S. Department of Justice, in particular permission to maintain full control of the software and even destroy it after use in this one case.

However, that’s not to say that Apple should bow to every government request in the future. The FBI has been quite clear about what it wants: a backdoor to every encrypted system. That is a bad idea, because you cannot make a backdoor for law enforcement alone; such a backdoor could be accessible to anyone with enough skill and knowledge. So if these requests multiply to the point where they are routine, or assumed in every case involving a phone, Apple and its brethren will need to make a stand.

Alex Hew is a graduate student in computing security at the Rochester Institute of Technology. He also works full-time in the defense sector.

Damian Kumor is currently a computer security master’s student at the Rochester Institute of Technology and a project manager at Xerox.

Rick Mislan is a visiting professor at RIT, where he created the Mobile Security and Forensics curriculum and provides forensic support to the RIT Information Security Office. While at Purdue University, he created the annual Mobile Forensics World conference. He is a former Electronic Warfare Officer in the US Army and continues to support various law enforcement, military and intelligence agencies worldwide.


View From the Valley

IEEE Spectrum’s blog featuring the people, places, and passions of the world of technologists in Silicon Valley and its environs.
Contact us:  t.perry@ieee.org

Senior Editor
Tekla Perry
Palo Alto