I’ve been working for quite a while on a longer piece about the argument that “backdoors make us less secure,” an article of faith among cryptographers, hackers, and computer scientists that is adhered to with such condescension, vehemence (and at times venom) that I can’t help but want to subject it to the closest scrutiny. (I’ve previously written a bit about these issues with regard to the financial technology communications system Symphony and full-scale secrecy systems like Tor.)
Leaving the more general question for a later time, I’ve noticed something in the discussion of Apple’s refusal to comply with a Federal court order regarding an iPhone used by alleged San Bernardino mass shooter Syed Rizwan Farook that puzzles me a bit, and that ties to the more general ideology that underlies the cypherpunk & cyberlibertarian ideologies that seem to me everywhere visible in these debates (which I and a few others think makes these debates much weaker than they should be).
Namely: when we speak of “backdoors” that “make us less secure,” is the point that a) the creation of an actual backdoor will make systems less secure, because that backdoor could be discovered by opponents–or, probably more to the point, released by an untrustworthy actor inside the vendor itself?
Or is the point b) that a system in which it is possible to create a backdoor is already inherently backdoored, whether or not the backdoor has been created? That is, is it virtual rather than actual backdoors (or virtual ones in addition to actual ones) that make systems vulnerable?
The idea of a “backdoor” is metaphorical. What it means in any particular system and in any given instance may be similar to or different from other backdoors. So we are talking at a level of very general principle; but then again the notion that “backdoors make us less secure” is asserted at just this level.
As a general principle, the point of the “backdoors make us less secure” argument seems to be that if you build a system with a “backdoor” in it, then anyone will be able to find it, not just the people whom you want to have access to it.
Leaving aside this general issue, the situation in #AppleVsFBI is not that one. Nobody is talking about modifying the system software already released onto existing phones. Nobody is talking about shipping a new version of iOS that includes a new “backdoor” for the Feds to use: they are instead talking about Apple building a tool to modify the software on one specific device. That tool would not directly affect other existing phones in any way whatsoever.
Here is how Apple describes the situation in its Feb 16, 2016 “Letter to Our Customers.” Apple claims that creating the new version of its software the Court order requires would itself amount to a general backdoor, even though the Order specifically requires that it be usable only on the one iPhone:
The FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
…
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.

Photo from Portland #AppleVsFBI protest (image source: Mike Bivens on Twitter)
While the FBI asks to be given the tool to do this itself, its request and the judge’s order clearly allow Apple to perform the entire operation within its own physical location (see the second part of paragraph 3 of the judge’s order and footnote 4 of the FBI’s motion).
So Apple’s argument is that even if its own technicians develop a technique to hack one iPhone, and that hack never gets outside of Apple’s internal security, the very fact that it exists means a “backdoor” has been introduced into the iOS ecosystem and all users have been put at risk: not because anyone actually got access to the modified OS, but merely because the creation of the technique (“technique” is the specific word Apple uses) would inherently weaken all the existing iPhones that have not been touched in any way.
The problem with this argument is subtle until you see it.
Apple itself admits that the FBI is not asking Apple to introduce backdoored software into all its phones—the traditional meaning of “backdoor,” as the “backdoors make us less secure” argument would have it. It is, instead, asking Apple to develop a technique that could allow Apple and Apple alone to hack into the iOS software as it currently exists (or more accurately, to replace the OS on a given phone).
But either iOS is hackable through this method or it isn’t. There is plenty of reason to think that Apple has at least determined whether it is technically possible or not, and it is notable that Apple does not claim in its letter that what the FBI wants is technically impossible.
Let’s call the new version of iOS “hOS.”
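To be concrete about what hOS would have to do: as I read the court order, it would disable the feature that erases the phone after ten failed passcode attempts, remove the software-imposed delays between attempts, and allow passcodes to be submitted electronically rather than typed by hand, so that the FBI could simply try passcodes until one worked. Here is a rough back-of-the-envelope sketch of that arithmetic in Python; the roughly 80 milliseconds per attempt is my own assumption about the hardware-enforced key-derivation time, not a figure from the order or from Apple.

```python
# Rough sketch: why removing iOS's retry limits matters for brute-forcing.
# PER_GUESS_SECONDS is an assumed figure (~80 ms per passcode attempt,
# on the order of the hardware-enforced key-derivation time); real values vary.
PER_GUESS_SECONDS = 0.08

def worst_case_hours(passcode_digits: int) -> float:
    """Hours to exhaust an all-numeric passcode space once the
    ten-try erase limit and software-imposed delays are disabled."""
    keyspace = 10 ** passcode_digits
    return keyspace * PER_GUESS_SECONDS / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: ~{worst_case_hours(digits):.1f} hours worst case")

# Under these assumptions this prints roughly:
#   4-digit passcode: ~0.2 hours worst case
#   6-digit passcode: ~22.2 hours worst case
```

Whether or not those particular numbers are exactly right, the point is that the protections hOS would strip out are all that stand between a short numeric passcode and the data it protects.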
Apple’s argument is: the mere creation of hOS creates a backdoor in all iPhones, even though nobody has access to hOS itself outside of Apple (and possibly the FBI).
Because remember, Apple has not argued that either its own in-house security or the FBI’s internal security is poor enough that the specific instance of hOS would get loose.
The problem is that either hOS is possible or it isn’t.
And if hOS is possible, it is already possible. Nothing about actually creating it makes it more or less possible.
Therefore Apple’s master argument fails. Apple says that if hOS is possible, the security of its products is weakened, and everyone suffers.
Yet Apple has already admitted that hOS is possible. Nobody’s written it yet (as far as we know), but someone could—presumably, even someone with the requisite technical skill outside of Apple, like the at least moderately lunatic John McAfee.
If simply admitting that hOS is possible somehow tips off hackers to the insecurity of iOS and makes hacking it suddenly much more likely, then Apple’s own “Letter to Our Customers” inadvertently does just this.
My own view is that the “backdoors make us less secure” argument is far less determinative and airtight than its loudest advocates want to claim, for reasons I hope to explore later, in addition to the ones offered here. But in this case, Apple has already admitted that it can create hOS.
Its only remaining argument is that its own internal security is so weak that it cannot guarantee that, once hOS is written for one phone, its own employees won’t figure out how to apply it to other phones and sell that capability to outsiders. But if this is true, then by Apple’s own admission such a breach is already possible. Maybe actually writing hOS adds a small amount of additional risk, but it is hard to see how much, especially since, for all the reasons we would not know whether someone inside Apple had “stolen” hOS after it was written, we also do not know whether someone has already written it. The “backdoor” in this case is virtual, not actual: the mere possibility of its existing is the danger Apple points at, a possibility Apple itself openly acknowledges in its letter, thereby creating the very “backdoor” it claims to be defying the court order in order to forestall.
Ironically, the case itself gives some evidence against all of this. If the simple fact that iOS is virtually hackable meant it were actually hacked, the FBI would not need to go to so much trouble to get Apple to hack it. Further, this is one of the places where encryptionists want to have it both ways, suggesting that the FBI (or the NSA) can already hack the phone but is working through the courts for some conspiratorial reason or other, which, if true, makes this entire conversation moot, although the encryptionists never want to admit that. I see no reason to accept this contorted logic. It seems to me much more plausible that writing hOS is hard, that it is best done by those with direct expertise in iOS, and that Apple’s security is perfectly adequate to make sure that others outside Apple do not get hold of hOS, copy it, and apply it to other phones. And note that if this is wrong, we are already in the bad place Apple claims that writing hOS would put us in, because Apple cannot trust its own security enough to make sure that employees are not selling its secrets to hackers, or that some of its own employees are not themselves hackers, which is probably the case.
A final note on a related point: few seem to have noticed that, along the way, it has been made clear that Apple routinely provides access to encrypted backups of iPhone data in its iCloud service (see, e.g., discussion here). If hOS constitutes a backdoor, then the existing ability to decrypt any iCloud backup of a phone counts as a backdoor even more clearly. Yet nobody is screaming about this existing backdoor; they are instead arguing that the creation of the new one, despite being targeted at a single device, would produce a devastating loss in security, even though it is still less pervasive than what already exists in iCloud, a capability that is not, as far as I know, even being proposed for conversion to a non-decryptable system. None of the dangers we are told result from the “weakening of encryption” associated with “backdoors” appear to have resulted; iCloud backups remain a relatively secure and private service, although of course some technologists recommend against using them. Even the prior breaches of iCloud, especially the hacking of celebrity accounts, which seem to have partly prompted Apple’s push toward total encryption, do not appear to have necessitated a version of iCloud that Apple itself cannot access; as far as my scanning of the news feeds tells me, we have had far fewer breaches of the system in the years since.
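To make that architectural contrast concrete, here is a minimal sketch in Python, using the cryptography library’s Fernet primitive purely as a stand-in; this is not Apple’s actual implementation, just an illustration of the difference between data encrypted under a key the provider holds (as with iCloud backups) and data encrypted under a key that never leaves the device.

```python
# Illustration only: provider-held keys vs. device-held keys.
# Fernet stands in for whatever cipher a real service would use.
from cryptography.fernet import Fernet

# iCloud-backup-style: the provider generates and keeps the key,
# so it can decrypt any backup on demand (for example, under a warrant).
provider_key = Fernet.generate_key()
backup = Fernet(provider_key).encrypt(b"contacts, photos, messages")
print(Fernet(provider_key).decrypt(backup))  # the provider can always do this

# Device-style: the key is derived and held only on the handset,
# so the provider stores ciphertext it has no way to read.
device_key = Fernet.generate_key()  # imagine this derived from the user's passcode
on_device = Fernet(device_key).encrypt(b"contacts, photos, messages")
# Without device_key, neither the provider nor anyone it hands the
# ciphertext to can decrypt on_device.
```

The first half of that sketch is, in effect, what already exists for iCloud backups; that is the “backdoor” nobody is screaming about.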
So the argument (attributed here to a Berkman Center founder) that “if Apple says yes to the U.S. government, it will make it harder to say no in countries with very different values” is an odd one to make, since we are already in that situation across the board: it is only Apple’s creation of a system to which law enforcement might not be able to get access that would change things.
So what? As I’ve often said, an iterative approach to these matters—one that is consistent with other Silicon Valley practices, as opposed to the blanket “no backdoors” dictum—makes a lot of sense, especially given that certain schemes, such as the new absolutely undecryptable (at least given current technologies and resources) iMessage system, which law enforcement is in my opinion very justifiably concerned about, are being developed and sold. These systems advertise themselves as putting communication channels outside and above all legal, targeted investigation—whether for regulation or law enforcement—and are supported by people whose hatred for the US Government translates quickly into a hatred of all government, which licenses building a system that no government, no matter how “good,” could penetrate. One sees marks of that hatred in Apple’s own recent public discourse, especially when the same Tim Cook who speaks so strongly about #AppleVsFBI says that claims the company should pay more taxes than it does are “total political crap.”
This is cyberlibertarianism in action: the unacknowledged importation of far-right concepts and themes (in this case, the idea that “government” and “evil” are absolute synonyms) into the discourse of digital technologists who do not see themselves as aligned with the far right. One can only support systems like iMessage if one believes that the very idea of government is offensive—not just our government, but any government whatsoever. Despite the many ways in which such a view is contradictory and incoherent, it remains a widespread cri de coeur among many on the far right, and they do their best to spread that message widely. If you think government should not exist (a view I find largely incoherent, but one we need to discuss on its own terms), we should have that political discussion. We should not be having companies build tools whose partly stated reason for being is to disable vital functions of government without quite saying so.