The solution to the VW scandal is to empower regulators and make sure they have access to any and all parts of the systems they oversee. The solution is not to open up those systems to everyone. There is no “right to repair,” at least for individuals. Whether or not it deserves to be called a “freedom,” the “freedom to tinker” is not a fundamental freedom. The suggestion that auto manufacturers be forced to open these systems is wrongheaded at best and disingenuous at worst. We have every reason to think that opening up those systems would make matters worse, not better.
Volkswagen’s manipulation of the software that runs the emissions control devices in its cars has rightly produced outrage, concern, and condemnation. Yet buried in the responses have been two very different lessons—lessons that may at first sound very similar, but on closer examination are as different as night and day. Some writers have not been careful to distinguish them, and this is a huge mistake: the two lessons embody entirely different philosophies that in most important ways contradict each other.
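To make concrete what “manipulation of the software” means here, the following is a minimal, hypothetical sketch of defeat-device logic. It is not VW’s actual code; the specific signals and thresholds are illustrative assumptions, based only on the widely reported fact that the software inferred when the car was on an emissions-test dynamometer and enabled full emissions controls only then.

```python
def emissions_mode(steering_angle_deg, nondriven_wheel_kmh, driven_wheel_kmh):
    """Illustrative sketch of 'defeat device' logic (hypothetical, not VW's code).

    On a test dynamometer, the driven wheels spin while the car does not
    actually move: the non-driven wheels are stationary and the steering
    wheel stays centered. Software can infer 'we are being tested' from
    such signals and switch behavior accordingly.
    """
    on_dynamometer = (
        driven_wheel_kmh > 0
        and nondriven_wheel_kmh == 0          # car not actually moving
        and abs(steering_angle_deg) < 1.0     # steering held centered
    )
    # Full NOx controls only when a test is suspected; otherwise favor
    # performance and fuel economy at the cost of higher emissions.
    if on_dynamometer:
        return "test mode: full emissions controls"
    return "road mode: reduced emissions controls"
```

The point of the sketch is only that such a cheat is a few lines of conditional logic, invisible from outside the software itself—which is exactly why the question of *who* gets to inspect that software matters.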
The two philosophies will be familiar to my readers:
- The cyberlibertarian, cypherpunk, FOSS perspective: we can prevent future VWs by mandating all such software be available for inspection and even modification by users;
- The democratic response: regulators like the EPA should have access to proprietary code like that in VW vehicles.
These two responses may sound similar, but they are radically different. One suggests that the wisdom of the crowd will result in regulations and laws being followed. The other puts trust in bodies specifically chartered and licensed to enforce regulations and laws.
One says—and this is everywhere in the discussions of the topic—that it’s the fault of the Digital Millennium Copyright Act (DMCA) and EPA’s resistance to the proposed grant of an exemption for users to access automobile software, an exemption for which the Electronic Frontier Foundation was a principal advocate. It is hard to find an account of this story that does not excoriate EPA for opposing that exemption and even blame that opposition for how long it took to uncover VW’s cheating. Contained within that sentiment is the inherent view that regulators can’t do their jobs and that ordinary citizen security researchers can and will do it better; this typical cyberlibertarian contempt and narcissistic lust-for-power is visible in much of the discourse adopting this view. This, by far, has been the dominant response to the story. The only real challenge to this narrative has come from the estimable Tom Slee, whose “Volkswagen, IoT, the NSA and Open Source Software: A Quick Note” is, along with Jürgen Geuter’s “Der VW-Skandal Zeigt unser Vertrauensproblem mit Software” (approximately: “The VW Scandal Demonstrates Our Confidence Problem with Software”), the best thing I’ve read on the whole scandal, and with which I am in strong agreement.
The other says that certain social functions are assigned to government for good reason. From this perspective, one might want to look at the massive defunding of regulatory agencies and the political rejection of regulation engineered not just by the right in general but by the digital technology industries themselves as a huge part of the problem. Critically, from this perspective, the DMCA has nothing to do with this issue at all. Regulators can and do look inside of products that are covered by trade secrecy and other intellectual property agreements. They have to.
These aren’t just abstract differences. They embody fundamentally different ways of seeing the world and fundamentally different visions of how we want the world to be organized.
I think the first view is incoherent. It says, on the one hand, we should not trust manufacturers like Volkswagen to follow the law. We shouldn’t trust them because people, when they have self-interest at heart, will pursue that self-interest even when the rules tell them not to. But then it says we should trust an even larger group of people, among whom many are no less self-interested, and who have fewer formal accountability obligations, to follow the law.
If anything, history shows the opposite. The more we make it optional to follow the law—and to be honest, the nature of an “optional law” is about as oxymoronic as they come, but it is at the core of much cyberlibertarian thought—the more we put law into the hands of those not specifically entrusted to follow it, the more unethical behavior we will have. Not less. That’s why we have laws in the first place—because without them, people will engage even more in the behavior we are trying to curtail.
Now consider the current case. What the cyberlibertarians want, even demand, is for everyone to have the power to read and modify the emissions software in their cars.
They claim that this will eliminate wrongdoing. In my opinion, and there is a lot of history to back this up, it will encourage and even inspire wrongdoing.
This is where the cyberlibertarian claim turns into pure fantasy, of a sort that underlies much of their thinking in general. Modifying cars has a significant history. You don’t need to go far to find it.
Show me the history of car owners modifying their cars to meet emissions and safety standards when they don’t otherwise meet them.
Because what I’ll show you is that the overwhelming majority of car modifications, insofar as they deal with regulatory standards, are performed to bypass standards like those and many others.
You don’t have to read far in the automotive world to see how deeply car owners want to bypass those standards, in the name of performance and speed. You’d have to read much farther and much deeper to find evidence of automobile owners selflessly investigating whether or not their vehicles are meeting mandates.
Not only that: we don’t have to look far to find this pattern directly regarding diesel Volkswagens. A recent story on the VW scandal at the automotive-interest site The Truth About Cars notes that
the aftermarket community has released modifications for the DPF and Adblue SCR systems long before there was any talk of reduced power and economy coming from a potential fix for the emissions scandal. They looked to gain more power and better fuel economy by modifying or deleting the DPF system.
These “aftermarket tuner” kits for the DPF and Adblue SCR systems have to be marketed “as off-road only as they violate federal emissions laws.” These are the selfless regulation-focused folks we should rely on to protect our environment? Seriously?
In fact, EPA has already studied the specific question of software modification to emissions systems (which is part of what makes me wonder whether those who have excoriated EPA’s response have actually read the letter):
Based on the information EPA has obtained in the context of enforcement activities, the majority of modifications to engine software are being performed to increase power and/or boost fuel economy. These kinds of modifications will often increase emissions from a vehicle engine, which would violate section 203(a) of the CAA (Clean Air Act), commonly known as the “tampering prohibition.” (2)
It is beyond ironic that this scandal has been taken to demonstrate that “we need to open up the Internet of Things,” or that “DMCA exemptions could have revealed Volkswagen fraud,” or that the scandal makes clear the “dangers of secret code.” I would argue that the lesson is entirely different: people will cheat. Making it easy for people to cheat means they will cheat more. Regulators need access to the code that runs things, but precisely because people will cheat, ordinary people should not have that access. They should not have access to the code that runs medical devices, to the code that runs self-driving cars, to the code that runs airplanes, or to the code that controls security systems in our houses.
Rather than showing that EPA was wrong to oppose the DMCA exemption and that people like Eben Moglen are right about opening up proprietary software, we would do better to observe what Moglen himself did about the elevator that the New York Times writes about in its paean to him and his work. That story begins and frames itself around a discussion of elevator safety. Here is the elevator anecdote in its entirety, from a 2010 talk by Moglen:
In the hotel in which I was staying here, a lovely establishment, but which I shall not name for reasons that will be apparent in a moment, there was an accident last week in which an elevator cable parted and an elevator containing guests in the hotel plummeted from the second story into the basement. When you check in at the hotel you merely see a sign that says “We are sorry that this elevator is not working. And we are apologetic about any inconvenience it may cause.” I know that the accident occurred because a gentleman I met in the course of my journey from New York to Edinburgh earlier this week was the employer of the two people who were in the car. And in casual conversation waiting for a delayed airplane the matter came out. I have not, I admit, looked into the question of elevator safety regulation in the municipality. But in every city in the world where buildings are tall (and they have been tall here in proportion to the environment for longer than they have in most parts of the world) elevator safety is a regulated matter, and there are periodic inspections, and people who are independent engineers, working at least in theory for the common good, are supposed to conduct such tasks as would allow them to predict statistically that there is a very low likelihood of a fatal accident until the next regular inspection.
While this anecdote is taken as an argument for user access to the code that runs elevators, it is actually anything but. It is an argument for regulators having access to that code, period. I do not want the hackers in my building to have access to the elevator code, and neither should you. I do not want them to have access to the code in voting machines.
Moglen made this remarkable statement in the New York Times article:
If Volkswagen knew that every customer who buys a vehicle would have a right to read the source code of all the software in the vehicle, they would never even consider the cheat, because the certainty of getting caught would terrify them.
I don’t know about terror, but I would be distinctly concerned, as EPA is, that this “right” would mean a regime of emissions cheating by individuals that would not only far outflank what Volkswagen has apparently done, but, by dint of its being realized in a thousand different schemes for software alteration, would make those modifications virtually impossible to check. What is particularly striking is that this reasoning, which builds on obvious, well-understood facts, could be jettisoned in favor of an idealistic and obviously false view of human political conduct for which virtually no evidence can be generated.
In fact, to the degree that we have evidence, we know that the opposite is true: Linux, Android, and many other OS projects are routinely attacked by hackers, while the Apple iPhone operating system—contained in its famous “walled garden”—continues to be one of the safest software environments around. (Reports have indicated that up to 97% of all mobile malware is found on the open source Android system.) Contrast this to the “jailbroken” iOS software, which is pretty much the best way to ensure that your iPhone gets malware. (In fact, just this week we have the first-ever report of malware on iPhones that aren’t jailbroken.) Really? Opening things up protects us? Who’s zooming who?
Not only that: we have plenty of evidence that even in small, isolated cases that are critical to security and that thousands of coders care deeply about (I am specifically thinking of OpenSSL, as Geuter discusses in his article), open source still does not produce secure products—certainly no more secure than closed source does.
All of this should really raise questions for people about the motivations behind the demand that security software be opened: I think that, just as in this case, a selfless desire to improve the world for everyone motivates at best some of those involved in this question. Just as prominent—perhaps more prominent—is an egotistical drive to control and to deny the legitimacy of any authority but oneself. That attitude is exactly the one that leads to regulation-flouting modifications and the production of malware, not to combating them. Like everything else in the world of the digital, it relies on an extremely individualistic, libertarian conception of the self and its relationship to society.
One additional thing that I find a bit dispiriting about this is that one of the best books to come out in recent years about digital politics, Frank Pasquale’s Black Box Society, is specifically focused on the question of technological and particularly algorithmic “black boxes.” Pasquale specifically argues that regulators must be given not just the power (some of which they already have, much of which—for example in the case of algorithms used by Facebook, Google, and Acxiom—they do not) but also the capability (which means resources) to see into these algorithmic systems that affect all of us. Pasquale makes a long and detailed argument and an impassioned plea for a “Federal Search Commission,” parallel to the FDA and EPA, that would be able to see into important technologies whether or not they are protected by trade secrets. Pasquale has been suggesting this for a very long time. He is among the most prominent legal theorists addressing these issues. How is it that when an event occurs that should cause at least some well-informed commentators to show how it validates that thought, virtually nobody does? And worse: the New York Times actually writes a story saying that the scandal validates the work of Eben Moglen, who might well be thought of as the political opposite of Pasquale—despite the fact that Moglen’s apparent version of this story makes very little sense and contradicts his own analysis of similar situations.
That is part of why cyberlibertarianism must be understood as an ideology, not an overt political program. Like all ideologies, it twists issues into parodies of themselves in order to advance its agenda, even when the facts point in exactly the opposite direction.
In a response to this piece on Twitter, Jürgen Geuter said that he read me to be saying that closed source is more secure than open source. I don’t mean to be saying that; I mean to make only the more modest claim that open source is not inherently more secure than closed source. As for which is more secure, I am not sure that question has an answer on the open/closed axis, but I really don’t know. In fact I take that to be part of the lesson of Geuter’s excellent recent piece in Wired Germany that I link to above. The number of people who can accurately evaluate any project for its total security profile is somewhere between “very small” and “nonexistent.” The Android vs. iOS example I use is meant only to show that open source projects are not inherently more secure; there is much more to that story than open vs. closed, and of course iOS is itself at least partly based on the Free Software Unix operating system BSD, as are other Apple operating systems. But it is fair to say that Apple’s “walled garden” has long been a target of ire from the developer community, and yet has been one of the most secure platforms available—at least so far. I do draw from this fact the conclusion that the demands by FOSS advocates that all systems be opened because it will make them more secure are at best unfounded and at worst dishonest—dishonest because a significant number of people in those communities want that access not to increase security but specifically to learn how to defeat it more easily. And I do think, whether it makes the systems as a totality less secure or not, that exposing the complete internals of systems to everyone gives attackers an informational advantage no matter how you slice it.