On Feb. 16, a federal judge ordered Apple Inc. to help the Federal Bureau of Investigation crack the encrypted iPhone 5c used by one of the perpetrators of the San Bernardino attack in December. The order, which Apple has refused, immediately sparked a debate about personal privacy and national security. Indrajit Ray, CSU professor of computer science in the College of Natural Sciences, is an expert in network security and applied cryptography. Here’s his view of what’s at stake in this case.
Q: What’s a simple explanation of the technology on the iPhone (and other mobile devices) that protects personal information?
A: The passcode used to just be a simple mechanism to get into the machine. But what Apple is now doing – using technologies like a passcode or touch sensor – is that they create secret information, on the fly, to encrypt some of the data you’re storing. Encrypted data is a jumbled puzzle that you cannot recognize just by looking at it, and it is done with mathematical rigor, so even the most powerful computers would take an enormously long time to break it. The only practical way to decrypt all this information without the key is to try a lot of different combinations – which is called brute forcing – and eventually you’ll succeed. So what Apple has done is enable another mechanism on top of this one that deletes the information on the phone if someone tries too many times to unlock it.
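The two ideas in that answer – a key derived on the fly from the passcode, and a counter that wipes the data after too many wrong guesses – can be sketched in a few lines of Python. This is purely illustrative and is not Apple’s actual implementation; the salt, iteration count, stored check value, and the `Device` class are all hypothetical stand-ins.

```python
# Illustrative sketch only -- NOT Apple's actual design.
# Idea 1: the encryption key is derived on the fly from the passcode,
#         so the vendor never holds the key itself.
# Idea 2: a counter erases the data after too many failed attempts.
import hashlib
import os

SALT = os.urandom(16)      # random per-device value (hypothetical)
MAX_ATTEMPTS = 10          # iOS can erase data after 10 failed tries

def derive_key(passcode: str) -> bytes:
    """Stretch the passcode into a 256-bit key (PBKDF2 as a stand-in)."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, 100_000)

class Device:
    def __init__(self, passcode: str):
        # Simplified: store the derived key as a check value.
        self.key_check = derive_key(passcode)
        self.failed = 0
        self.wiped = False

    def unlock(self, guess: str) -> bool:
        if self.wiped:
            return False               # data is gone; no key can recover it
        if derive_key(guess) == self.key_check:
            self.failed = 0
            return True
        self.failed += 1
        if self.failed >= MAX_ATTEMPTS:
            self.wiped = True          # "deletes information from the phone"
        return False
```

Because the key only exists when the correct passcode is entered, there is nothing stored on the device (or at Apple) for anyone to hand over.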
Q: Technically speaking, what is the government asking Apple to do?
A: The government is asking Apple for software that would allow them to try many different combinations of passcodes, and to bypass the file deletion. What they are not asking for is the encryption key – the secret piece of information that is used in all encryption. Encryption is a locked door, and the key opens it. If you don’t have the key, you cannot open the door. That’s how the technology works. Apple does not have the key, because the key is derived from the passcode.
Q: So what do you feel is the real issue here?
A: If Apple complies, the fear is that this will set a precedent. If somebody can be required by one government to do this, maybe other governments, for example, could force Apple or other companies to comply with similar requests.
The fear that the technology could be stolen and used to spy on people is real, but it’s not as big a concern as the idea of setting a precedent. From a technological perspective, if we could have a guarantee that the technology would only be used for this one instance, it would probably be OK. But since we do not have such a guarantee, Apple’s complying with the request is not a good thing to do.
At the same time, one could make the case that there is sometimes a legitimate need for law enforcement to be able to access personal information. But we need to come up with a proper methodology for it. Right now, that doesn’t exist.
Government arm-twisting is not the answer. There is a legitimate concern from the law enforcement side, and there is a legitimate concern from the public’s side. A legal protection mechanism to balance the two sides is what’s needed here.