I’ve talked a number of times about researchers creating security-busting software just because they can. That software often escapes into the wild, where people who wouldn’t normally have a clue how to overcome security features can use it to break the latest protections in a product or application. Now the government is trying to force Apple (and probably other vendors) to write such software in pursuit of information hidden by encryption, based on the mandates of a 227-year-old law written at a time when no one had any idea that modern digital devices would exist. The order issued by the judge in the case seems quite reasonable until you consider that once Apple writes the software, it could end up in the wild, where hackers will almost certainly find ways to use it against legitimate users, making it impossible to ensure that private information, such as credit card data, really does remain private.
The iPhone comes with some interesting security features that make it a relatively secure device. For example, tampering with certain device hardware will brick the device, which is the sort of security feature more devices should have: modifying the security hardware causes the device to lock down in order to protect the data it contains. The encryption Apple offers with the iPhone is also first-rate. No one but the user holds the key used to unlock the encryption, which means that only the user can create a security problem by handing the key out to others.
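The core idea behind that last point is that the key protecting the data is derived from something only the user knows, so it never has to be stored or shared. Here's a minimal sketch of that principle using Python's standard library; this is an illustration of key derivation in general, not Apple's actual implementation, and the salt and iteration count are made-up values:

```python
import hashlib

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Derive a 256-bit key from the user's passcode via PBKDF2.
    # Only someone who knows the passcode can reproduce the key;
    # the key itself never needs to be stored anywhere.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

# Illustrative salt; a real device would use a hardware-bound value.
salt = b"device-unique-salt"
key = derive_key("1234", salt)

# The correct passcode reproduces the key; a wrong guess does not.
assert derive_key("1234", salt) == key
assert derive_key("0000", salt) != key
```

Because the vendor holds neither the passcode nor the derived key, it cannot simply hand the data over; that is what makes the government's request a demand to write new security-defeating software rather than a routine disclosure.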
The government is trying to change this scenario so that it can learn anything it can about the data on Syed Rizwan Farook’s iPhone (Farook was one of the two San Bernardino shooters). On the surface this seems like a good idea, if for no other reason than to potentially prevent other shootings. However, the manner in which the government has pursued the information opens the door to all sorts of abuse, and then there is the matter of the software getting out into the wild. The issue is that the law hasn’t kept up with technology, which is a recurrent problem. The government doesn’t have a law that covers breaking encryption in a reasonable way, so it resorts to a 227-year-old law that was never intended to address this need. The fact that the government is using the same law to try to force Apple to breach iPhone security in at least twelve other cases means that the argument that this is a one-off requirement doesn’t hold any water. Once Apple cooperates even once, it sets a precedent that will allow the government to force additional cooperation, even when such cooperation decidedly damages the privacy of innocent parties.
Tim Cook has rightly refused to cooperate with the government. There really is too much at stake in this case, and even the government should be able to figure that out. What needs to happen is for our government to catch up with technology and write laws that everyone can live with: laws that preserve the privacy provided by encryption, yet make it possible for the government to obtain the information needed to solve a case.
The question here is more complicated than simply managing information properly. It’s also one of keeping good technology (such as that found in Security for Web Developers) working properly and ensuring that government entities don’t abuse their positions. What is your take on the San Bernardino shooting and the information needed to pursue it? How do you feel about keeping your private data truly private? Let me know at John@JohnMuellerBooks.com.