Passwords, privacy and protection: can Apple meet FBI’s demand without creating a ‘backdoor’?

What the FBI is and isn’t asking for
The feds aren’t demanding Apple create a “backdoor.” In encryption, a backdoor is a means of accessing protected content outside the normal (“front door”) process. For example, there could be a skeleton key built into the encryption mechanism. The National Institute of Standards and Technology is reputed to have standardized a random number generator – a function at the heart of most encryption techniques – with just such a facility built in.

Encryption with a backdoor is technology explicitly designed so that a third party – in most cases, law enforcement – can gain access to the protected data when the need arises. But it’s very hard to build a backdoor into encryption while still keeping the encryption hard for an attacker to defeat. I don’t believe anyone is seriously calling for such encryption anymore.
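To make the idea concrete, here is a toy sketch in Python of key escrow, one classic way a backdoor has been designed – emphatically not real cryptography, and not any scheme Apple or NIST actually uses:

    # Toy key escrow: the user's key travels with the message, wrapped
    # under a master key that a third party holds. Illustration only.

    def _xor(data: bytes, key: bytes) -> bytes:
        # Toy stream "cipher"; real systems use AES or similar.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    MASTER_KEY = b"escrow-master"   # hypothetical key held by the third party

    def encrypt(plaintext: bytes, user_key: bytes) -> bytes:
        # The backdoor: a copy of the user's key, wrapped under the
        # master key, is shipped alongside the ciphertext.
        escrow_field = _xor(user_key, MASTER_KEY)
        return bytes([len(user_key)]) + escrow_field + _xor(plaintext, user_key)

    def backdoor_decrypt(blob: bytes) -> bytes:
        # The third party never needs the user's key: it unwraps the copy.
        klen = blob[0]
        user_key = _xor(blob[1:1 + klen], MASTER_KEY)
        return _xor(blob[1 + klen:], user_key)

    msg = encrypt(b"attack at dawn", b"mykey123")
    print(backdoor_decrypt(msg))    # b'attack at dawn', no user key needed

The flaw described above is plain to see: the entire system is only as strong as the secrecy of MASTER_KEY, and anyone who obtains it can read every message.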

Rather than tinkering with its encryption, the FBI says it has asked Apple only to modify the defense mechanisms built into iOS, its operating system: the escalating delays between passcode guesses and the feature that erases the device after ten failed attempts. It’s presumably easy for Apple to create a version of iOS with these features turned off. This would be a new, less secure version of the standard operating system.
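To see why those two features matter, consider a minimal simulation – hypothetical code against a pretend device, not any real attack tooling:

    # Hypothetical simulation: a 4-digit PIN with no delays, no erase cap.
    SECRET_PIN = "7394"   # stands in for the unknown passcode

    def unlock(pin: str) -> bool:
        return pin == SECRET_PIN

    # With the protections off, all 10,000 possibilities can be tried
    # electronically, one after another.
    for candidate in (f"{i:04d}" for i in range(10_000)):
        if unlock(candidate):
            print("PIN found:", candidate)
            break

    # With the real protections on, this loop would hit escalating delays
    # after a few guesses and trigger a wipe at the tenth failure.

On a real iPhone each guess is also slowed by a deliberate per-attempt computation, but even then a four-digit space is small enough to exhaust quickly once the delays and the erase cap are gone.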

This less secure operating system could be loaded onto Farook’s phone, which the FBI could then access more easily. Other iPhones would not be affected.

Software piracy is a major challenge here. Apple has to worry that copies of this insecure operating system could get out and become easily available – not just to the good guys, but also to the bad guys. It’s common practice for software to require that a license be verified explicitly with the software vendor; if the license is not verified, the software will not function. This mechanism could block the insecure operating system from normal use.
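As a rough sketch of such a gate – with a made-up vendor endpoint and token, and nothing resembling Apple’s actual mechanism:

    import urllib.request

    LICENSE_SERVER = "https://example.com/verify"   # hypothetical endpoint

    def license_is_valid(token: str) -> bool:
        # Phone home to the vendor; any failure counts as "not licensed".
        try:
            url = f"{LICENSE_SERVER}?token={token}"
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.status == 200
        except OSError:
            return False

    if not license_is_valid("DEMO-TOKEN"):
        raise SystemExit("License could not be verified; refusing to run.")
    print("Licensed copy: normal operation proceeds.")

A gate like this only refuses to run the program for unlicensed users; as the next paragraph notes, it does nothing to keep the software itself from being copied or repurposed.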

But if the insecure operating system is installed for the purpose of data theft, then this normal license protection may not help – even if the check blocks normal use, it may not stop data access. In other words, it could be problematic if copies of this insecure operating system proliferate. That said, it doesn’t seem that hard to make sure a one-time-use operating system never leaks out.

It therefore appears there are no major technical barriers, or even immediate consequent difficulties, preventing Apple from complying with the court order. Furthermore, it is hard to imagine a stronger case for law enforcement to gain access to encrypted data. In fact, a survey finds only 38 percent of Americans side with Apple and agree that the company shouldn’t unlock the terror suspect’s phone. Nevertheless, issues remain.

Our secure systems already fail all the time
It’s not easy to build a secure system: breaches are reported every day, despite the best efforts of many. And the defenses Apple has been asked to remove have already been defeated, at least on some versions of Apple’s products. Every additional wrinkle in a system’s design makes it more likely that new exploits will be found.

There is little question that this particular request from the FBI will not be the last. In all likelihood, Apple would be asked to use the insecure iOS in future cases. With every use, the chance of the software leaking increases.

It’s also worth noting that law enforcement already has access to the data in encrypted form, without any help from Apple. Encrypted data look like gobbledygook and must be decrypted before they make sense. (In contrast, if investigators had, or could guess, the PIN, they would have direct access to the data in the convenient form ordinary users see.)
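A few lines of Python, using the third-party cryptography package, make the point concrete (the message and key here are invented):

    # pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # on the iPhone, the key is instead derived
    box = Fernet(key)             # from the PIN and device-bound secrets

    ciphertext = box.encrypt(b"Meet at the usual place at noon.")
    print(ciphertext)               # unreadable bytes: the "gobbledygook"
    print(box.decrypt(ciphertext))  # the plaintext again, but only with the key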

The point of encryption is to make decryption hard. However, hard does not mean impossible. With sufficient effort and computational power, the FBI could decrypt this data with no help from Apple. But this route would be expensive and would take some time. In effect, what the FBI is requesting of Apple is to make its job easier, cheaper and faster.
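Some back-of-the-envelope arithmetic – illustrative figures, not measured benchmarks – shows the scale of “hard,” and why realistic attacks go after the short PIN rather than the encryption key itself:

    # Illustrative arithmetic only; the attacker speed is hypothetical.
    SECONDS_PER_YEAR = 3.15e7
    guesses_per_second = 1e12     # an extremely well-resourced attacker

    pin_space = 10 ** 4           # every four-digit PIN
    key_space = 2 ** 128          # every 128-bit encryption key

    print(f"All PINs: {pin_space / guesses_per_second:.0e} seconds")
    print(f"All keys: {key_space / guesses_per_second / SECONDS_PER_YEAR:.0e} years")
    # The PIN space falls in a hundred-millionth of a second; the key
    # space takes on the order of 1e19 years. Any realistic attack goes
    # after the PIN, which is exactly why the passcode defenses matter.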

Ultimately, how this matter gets resolved may depend more on the big-picture question of what privacy rights we as a society want for the data we record on our personal devices. Understanding the technical questions can inform this discussion.

H. V. Jagadish is Bernard A. Galler Collegiate Professor of Electrical Engineering and Computer Science, University of Michigan. This article is published courtesy of The Conversation (under a Creative Commons Attribution/No Derivatives license).