Apple, again, having to defend its decision not to include backdoors in its products to the Justice Department

Right now, the US Justice Department is trying to compel Apple to help them break into an Apple device to assist in the investigation into the Pensacola terrorist attack. They [the Justice Department] also want Apple to include backdoors in their products moving forward to make it easier for law enforcement and other government entities to get this kind of access in the future. All politics aside, both of these are bad ideas for some very fundamental reasons.

As far as anyone knows, Apple devices are not manufactured with any backdoors or handicaps in their security. This is by design, to protect the privacy and security of their customers. Based on that, Apple isn’t being obstinate; they legitimately do not have the ability to do what the Justice Department is asking. There are no magic backdoors that Apple can open on these devices, and that’s a good thing. It does make things inconvenient for law enforcement sometimes, but it protects the security and privacy of the users (some of whom are law enforcement, government officials and other potentially high-value targets for attackers) all the time.

Regarding the requirement for any company to intentionally handicap the security of its products, the same argument applies, with a slight caveat. Right now, it’s safe to assume that there are no vulnerabilities in Apple devices (ideally, in the entire Apple ecosystem) known to anyone, good guys or bad, that haven’t already been patched. It’s possible that such vulnerabilities may be discovered in the future, by good guys or bad, but there are none baked in. Finding those vulnerabilities and building an exploit to take advantage of them currently requires significant technical skill, time and other resources that most drive-by attackers or common criminals do not have. As they are found, they’re generally disclosed or discovered quickly, and Apple patches them so that the bad guys can’t take advantage.

Requiring Apple to intentionally include a backdoor changes that calculus. Those backdoors have to be documented, developed, tested and deployed into production. Training materials on how to use those backdoors will have to be developed, edited, approved, packaged and delivered to those who have demonstrated a need to know and, from there, taught to those who will actually be using the backdoors. If any point of that documentation, development, deployment or training process is leaked or compromised, Pandora’s box is open. I do understand that appropriate steps will be taken to safeguard these backdoors, but the reality is that it won’t be enough. Safeguards were in place to prevent the Target breach, but it happened. Safeguards were in place to prevent the Anthem breach, but it happened. Safeguards were in place to prevent the Equifax breach, but it happened. These are all private organizations, but the same holds true for government sites as well. The OPM breach was likely not planned for either, but it happened. The breach at the NSA leaked a treasure trove of vulnerabilities (ETERNALBLUE, ETERNALROMANCE, etc.) that later resulted in a flood of ransomware attacks, including WannaCry, Petya, Bad Rabbit and others.

Ultimately, although it is unfortunate that the terrorist in Pensacola used good operational security (opsec), there is no way for Apple to bypass the security on his phone to grant access to law enforcement. Additionally, the long-term and permanent danger created by forcing Apple (or any vendor) to include backdoors in its products far outweighs any short-term benefit.
