The Volkswagen Scandal: The DMCA Is Not the Problem and Open Source Is Not the Solution

tl;dr

The solution to the VW scandal is to empower regulators and make sure they have access to any and all parts of the systems they oversee. The solution is not to open up those systems to everyone. There is no “right to repair,” at least for individuals. Whether or not it deserves to be called a “freedom,” the “freedom to tinker” is not a fundamental freedom. The suggestion that auto manufacturers be forced to open these systems is wrongheaded at best and disingenuous at worst. We have every reason to think that opening up those systems would make matters worse, not better.

full post

Volkswagen’s manipulation of the software that runs the emissions control devices in its cars has rightly produced outrage, concern, and condemnation. Yet buried in the responses have been two very different lessons—lessons that may at first sound very similar, but on closer examination are as different as night and day. Some writers have not been careful to distinguish them, but this is a huge mistake, because the two lessons embody entirely different philosophies that contradict each other in most important ways.
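To make concrete what that manipulation amounts to, here is a purely hypothetical sketch of defeat-device logic. VW’s actual code has not been published; every signal name and threshold below is an assumption for exposition, not a reconstruction:

```python
# Illustrative sketch only: the *kind* of logic a "defeat device" embodies.
# None of these names or thresholds come from VW's unpublished code.

def looks_like_dyno_test(steering_angle_deg: float, wheel_speed_kmh: float) -> bool:
    """Heuristic: on a dynamometer the drive wheels spin while the steering
    wheel stays centered -- a signature real-world driving rarely produces."""
    return wheel_speed_kmh > 0 and abs(steering_angle_deg) < 1.0

def select_emissions_mode(steering_angle_deg: float, wheel_speed_kmh: float) -> str:
    # Run full exhaust treatment only when the car believes it is being tested.
    if looks_like_dyno_test(steering_angle_deg, wheel_speed_kmh):
        return "compliant"    # NOx controls fully active
    return "performance"      # controls scaled back on the road
```

The point of the sketch is how little such a cheat requires: a handful of sensor readings and one conditional.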

The two philosophies will be familiar to my readers:

  • The cyberlibertarian, cypherpunk, FOSS perspective: we can prevent future VWs by mandating all such software be available for inspection and even modification by users;
  • The democratic response: regulators like the EPA should have access to proprietary code like that in VW vehicles.

These two responses may sound similar, but they are radically different. One suggests that the wisdom of the crowd will result in regulations and laws being followed. The other puts trust in bodies specifically chartered and licensed to enforce regulations and laws.

One says—and this is everywhere in the discussions of the topic—that it’s the fault of the Digital Millennium Copyright Act (DMCA) and EPA’s resistance to the proposed grant of an exemption for users to access automobile software, an exemption for which the Electronic Frontier Foundation was a principal advocate. It is hard to find an account of this story that does not excoriate EPA for opposing that exemption and even blame that opposition for how long it took to uncover VW’s cheating. Contained within that sentiment is the inherent view that regulators can’t do their jobs and that ordinary citizen security researchers can and will do it better; this typical cyberlibertarian contempt and narcissistic lust for power is visible in much of the discourse adopting this view. This, by far, has been the dominant response to the story. The only real challenge to this narrative has come from the estimable Tom Slee, whose “Volkswagen, IoT, the NSA and Open Source Software: A Quick Note” is, along with Jürgen Geuter’s “Der VW-Skandal zeigt unser Vertrauensproblem mit Software” (approximately: “The VW Scandal Demonstrates Our Confidence Problem with Software”), the best thing I’ve read on the whole scandal, and with which I am in strong agreement.

The other says that certain social functions are assigned to government for good reason. From this perspective, one might want to look at the massive defunding of regulatory agencies and the political rejection of regulation engineered not just by the right in general but by the digital technology industries themselves as a huge part of the problem. Critically, from this perspective, the DMCA has nothing to do with this issue at all. Regulators can and do look inside products that are covered by trade secrecy and other intellectual property agreements. They have to.

These aren’t just abstract differences. They embody fundamentally different ways of seeing the world, and different visions of how the world should be organized.

I think the first view is incoherent. It says, on the one hand, we should not trust manufacturers like Volkswagen to follow the law. We shouldn’t trust them because people, when they have self-interest at heart, will pursue that self-interest even when the rules tell them not to. But then it says we should trust an even larger group of people, among whom many are no less self-interested, and who have fewer formal accountability obligations, to follow the law.

If anything, history shows the opposite. The more we make it optional to follow the law—and to be honest, the nature of an “optional law” is about as oxymoronic as they come, but it is at the core of much cyberlibertarian thought—the more we put law into the hands of those not specifically entrusted to follow it, the more unethical behavior we will have. Not less. That’s why we have laws in the first place—because without them, people will engage even more in the behavior we are trying to curtail.

Now consider the current case. What the cyberlibertarians want, even demand, is for everyone to have the power to read and modify the emissions software in their cars.

They claim that this will eliminate wrongdoing. In my opinion, and there is a lot of history to back this up, it will encourage and even inspire wrongdoing.

This is where the cyberlibertarian claim turns into pure fantasy, of a sort that underlies much of their thinking in general. Modifying cars has a significant history. You don’t need to go far to find it.

Show me the history of car owners modifying their cars to meet emissions and safety standards when they don’t otherwise meet them.

VW Super Beetle (1972). Image source: conceptcarz.com.

Because what I’ll show you is that the overwhelming majority of car modifications, insofar as they deal with regulatory standards at all, are performed to bypass standards like those.

You don’t have to read far in the automotive world to see how deeply car owners want to bypass those standards, in the name of performance and speed. You’d have to read much farther and much deeper to find evidence of automobile owners selflessly investigating whether or not their vehicles are meeting mandates.

Not only that: we don’t have to look far to find this pattern directly regarding diesel Volkswagens. A recent story on the VW scandal at the automotive-interest site The Truth About Cars notes that

the aftermarket community has released modifications for the DPF and Adblue SCR systems long before there was any talk of reduced power and economy coming from a potential fix for the emissions scandal. They looked to gain more power and better fuel economy by modifying or deleting the DPF system.

These “aftermarket tuner” kits for the DPF and Adblue SCR systems have to be marketed “as off-road only as they violate federal emissions laws.” These are the selfless regulation-focused folks we should rely on to protect our environment? Seriously?

In fact, EPA has already studied the specific question of software modification to emissions systems (which is part of what makes me wonder whether those who have excoriated EPA’s response have actually read the letter):

Based on the information EPA has obtained in the context of enforcement activities, the majority of modifications to engine software are being performed to increase power and/or boost fuel economy. These kinds of modifications will often increase emissions from a vehicle engine, which would violate section 203(a) of the CAA (Clean Air Act), commonly known as the “tampering prohibition.” (2)

It is beyond ironic that this scandal has been taken to demonstrate that “we need to open up the Internet of Things,” or that “DMCA exemptions could have revealed Volkswagen fraud,” or that the scandal makes clear the “dangers of secret code.” I would argue that the lesson is entirely different: people will cheat. Making it easy for people to cheat means they will cheat more. Regulators need access to the code that runs things, but precisely because people will cheat, ordinary people should not have that access. They should not have access to the code that runs medical devices, to the code that runs self-driving cars, to the code that runs airplanes, or to the code that controls security systems in our houses.

Rather than showing that EPA was wrong to oppose the DMCA exemption and that people like Eben Moglen are right about opening up proprietary software, we would do better to observe what Moglen himself said about the elevator that the New York Times writes about in its paean to him and his work. That story begins and frames itself around a discussion of elevator safety. Here is the elevator anecdote in its entirety, from a 2010 talk by Moglen:

In the hotel in which I was staying here, a lovely establishment, but which I shall not name for reasons that will be apparent in a moment, there was an accident last week in which an elevator cable parted and an elevator containing guests in the hotel plummeted from the second story into the basement. When you check in at the hotel you merely see a sign that says “We are sorry that this elevator is not working. And we are apologetic about any inconvenience it may cause.” I know that the accident occurred because a gentleman I met in the course of my journey from New York to Edinburgh earlier this week was the employer of the two people who were in the car. And in casual conversation waiting for a delayed airplane the matter came out. I have not, I admit, looked into the question of elevator safety regulation in the municipality. But in every city in the world where buildings are tall (and they have been tall here in proportion to the environment for longer than they have in most parts of the world) elevators safety is a regulated matter, and there are periodic inspections and people who are independent engineers, working at least in theory for the common good, are supposed to conduct such tasks as would allow them to predict statistically that there is a very low likelihood of a fatal accident until the next regular inspection.

While it is taken as an argument for user access to the code that runs elevators, it is actually anything but. It is an argument for regulators having access to that code, period. I do not want the hackers in my building to have access to the elevator code, and neither should you. I do not want them to have access to the code in voting machines.

Moglen made this remarkable statement in the New York Times article:

If Volkswagen knew that every customer who buys a vehicle would have a right to read the source code of all the software in the vehicle, they would never even consider the cheat, because the certainty of getting caught would terrify them.

I don’t know about terror, but I would be distinctly concerned, as EPA is, that this “right” would mean a regime of emissions cheating by individuals that would not only far outflank what Volkswagen has apparently done, but, by dint of its being realized in a thousand different schemes for software alteration, would make those modifications virtually impossible to check. What is particularly striking is that this reasoning, which builds on obvious, well-understood facts, could be jettisoned in favor of an idealistic and obviously false view of human political conduct for which virtually no evidence can be generated.

In fact, to the degree that we have evidence, we know that the opposite is true: Linux, Android, and many other open source projects are routinely attacked by hackers, while the Apple iPhone operating system—contained in its famous “walled garden”—continues to be one of the safest software environments around. (Reports have indicated that up to 97% of all mobile malware is found on the open source Android system.) Contrast this with “jailbroken” iOS, since jailbreaking is pretty much the best way to ensure that your iPhone gets malware. (In fact, just this week we have the first-ever report of malware on iPhones that aren’t jailbroken.) Really? Opening things up protects us? Who’s zooming who?

Not only that: we have plenty of evidence that even in small, isolated cases that are critical to security and that thousands of coders care deeply about (I am specifically thinking of OpenSSL, as Geuter discusses in his article), open source still does not produce secure products—certainly no more secure than closed source does.

All of this should really raise questions about the motivations behind the demand that security software be opened: I think, just as in this case, that a selfless desire to improve the world for everyone motivates at best some of those involved in this debate. Just as prominent—perhaps more prominent—is an egotistical drive to control and to deny the legitimacy of any authority but oneself. That attitude is exactly the one that leads to regulation-flouting modifications and the production of malware, not to combating them. Like everything else in the world of the digital, it relies on an extremely individualistic, libertarian conception of the self and its relationship to society.

One additional thing that I find a bit dispiriting about this is that one of the best books to come out in recent years about digital politics, Frank Pasquale’s Black Box Society, is specifically focused on the question of technological and particularly algorithmic “black boxes.” Pasquale specifically argues that regulators must be given not just the power (some of which they already have, much of which—for example in the case of algorithms used by Facebook, Google, and Acxiom—they do not) but also the capability (which means resources) to see into these algorithmic systems that affect all of us. Pasquale makes a long and detailed argument and an impassioned plea for a “Federal Search Commission,” parallel to the FDA and EPA, that would be able to see into important technologies whether or not they are protected by trade secrets. Pasquale has been suggesting this for a very long time. He is among the most prominent legal theorists addressing these issues. How is it that when an event occurs that should cause at least some well-informed commentators to show how it validates that thought, virtually nobody does? And worse: the New York Times actually writes a story saying that the scandal validates the work of Eben Moglen, who might well be thought of as the political opposite of Pasquale—despite the fact that Moglen’s apparent version of this story makes very little sense and contradicts his own analysis of similar situations.

That is part of why cyberlibertarianism must be understood as an ideology, not an overt political program. Like all ideologies, it twists issues into parodies of themselves in order to advance its agenda, even when the facts point in exactly the opposite direction.

Postscript

In a response to this piece on Twitter, Jürgen Geuter said that he read me to be saying that closed source is more secure than open source. I don’t mean to be saying that; I mean to make only the more modest claim that open source is not inherently more secure than closed source. As for which is more secure, I am not sure that question has an answer on the open/closed axis, but I really don’t know. In fact I take that to be part of the lesson of Geuter’s excellent recent piece in Wired Germany that I link to above. The number of people who can accurately evaluate any project for its total security profile is somewhere between “very small” and “nonexistent.” The Android vs. iOS example I use is meant only to show that open source projects are not inherently more secure; there is much more to that story than open vs. closed, and of course iOS is itself at least partly based on the free-software Unix operating system BSD, as are other Apple operating systems. But it is fair to say that Apple’s “walled garden” has long been a target of ire from the developer community, and yet has been one of the most secure platforms available, at least so far. I do draw from this fact the conclusion that the demands by FOSS advocates that all systems be opened because doing so will make them more secure are at best unfounded and at worst dishonest: dishonest because a significant number of people in those communities want that access not to increase security but specifically to learn how to defeat it more easily. And I do think, whether or not it makes systems as a totality less secure, that exposing the complete internals of systems to everyone gives attackers an informational advantage no matter how you slice it.

This entry was posted in “hacking”, cyberlibertarianism, and materiality of computation.

3 Comments

  1. wgreenhouse
    Posted October 7, 2015 at 2:24 pm | Permalink

    To read this piece without context, one might’ve supposed that the “defeat device” was discovered by a diligent employee of the US EPA, the California Air Resources Board, or one of the 27 EU members’ environmental regulatory agencies, or that discovery of the device was somehow prevented by trade secret or other IP law preventing such agencies from decompiling and studying the Volkswagen TDI cars’ engine management systems. In fact, none of this is true.

    In fact, the “defeat device” was discovered by a civil society group, the International Council on Clean Transportation, which exists to help consumers evaluate their purchasing choices and to lobby governments on their environmental regulations for vehicles. A lobbyist did the homework of dozens of governments. Indeed, the claims about chronic underfunding of environmental agencies are true, but that’s still completely shameful.

    And no access to the source code was required to achieve this feat. All that was required was running real-world performance tests on the vehicles, as opposed to the usual dynamometer or “rolling road” testing. In other words, the defeat device was not discovered by analysis of software, but by reverse engineering–determining the nature of the program by its behavior under various conditions (an act which could actually incur liability against the ICCT under various countries’ copyright and patent regimes, but probably will not in this case because of the positive media exposure).
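    The behavioral testing described above can be sketched in a few lines. The comparison, the 2x tolerance, and the figures below are invented for illustration; they are not the ICCT’s actual methodology or measurements:

```python
# Sketch of black-box defeat-device detection: compare certified lab NOx with
# NOx measured on the road. No source code is needed -- the evidence is the
# vehicle's behavior under different conditions. The tolerance and all numbers
# are illustrative assumptions, not the ICCT's actual methodology.

def flags_defeat_device(lab_nox_mg_per_km: float, road_nox_mg_per_km: float,
                        tolerance: float = 2.0) -> bool:
    """Flag a vehicle whose real-world emissions exceed its lab-certified
    emissions by more than `tolerance` times."""
    return road_nox_mg_per_km > tolerance * lab_nox_mg_per_km

# A compliant car behaves similarly in both settings; a cheating car does not.
honest = flags_defeat_device(lab_nox_mg_per_km=70.0, road_nox_mg_per_km=85.0)
cheater = flags_defeat_device(lab_nox_mg_per_km=70.0, road_nox_mg_per_km=700.0)
```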

    Surprisingly for someone who generally has a good eye for both fallacious technological solutionism and ideological claims not backed by evidence, Professor Golumbia manages to fall for a technological-solutionist fix (give these overworked and underfunded environmental regulators access to source code, which will somehow help despite the fact that predicting software’s behavior under all possible real-world conditions is an unsolved problem) and makes an ideological claim (only governments must be trusted to enforce environmental regulations) out of a fact pattern which actually shows the opposite, all at the same time.

    Civil society organizations and members of the public can help governments to enforce the law by investigating how software behaves under real-world conditions. In fact, not only can they, but the VW case is a literal example of That Exact Thing. Further, the source code issue and the question of whether things like cars or security systems should be protected by DRM or copyright licenses is moot, because, as the ICCT itself demonstrated, whether your goal is regulatory compliance or noncompliance, you can generally find out what you need to know by reverse-engineering.


    • David Golumbia
      Posted October 7, 2015 at 2:47 pm | Permalink

      I don’t disagree for one moment about civil society organizations and I thank you for pointing that out. I don’t see that any of the pieces I am responding to say anything about them. My piece is explicitly framed (the title makes this clear) as a response to the pieces that said the DMCA was the problem and open-sourcing the emissions controller code to everyone would have been the solution. I did not mean to be speaking to how this problem was uncovered. Your account of that is completely in line with the ones I’ve read. I am 100% in favor of appropriate bodies, governmental and non-governmental, having access to whatever is deemed legally appropriate and necessary, the determination of which is a process that usually involves regulators and industry and even civil society groups at some level deciding who and what is appropriate. To take a slightly different example, while I disagree with the idea that there is a “right” to repair for individuals, as the EPA letter makes clear, licensed automobile repair shops get access to a great deal of the code in current automobiles. That is fine with me. As I read the letter, and I may be wrong, no DMCA exemption was required for that–that was all hammered out between EPA, the auto manufacturers, and the repair shops.

  2. wgreenhouse
    Posted October 7, 2015 at 3:09 pm | Permalink

    That’s true that repair shop diagnostics generally do not require a DMCA exemption; the problem is that they are generally governed by private law, that is, a software license governing what the repair shop can use the code for, or an expensive single-purpose computer embodying the product of such a license for one make of car. This is probably an antitrust issue, as the same carmakers have an interest in driving competitor repair shops out of business so you have to go to the dealer, and for the civil society groups it creates a perverse incentive: VW might want to strain their budgets so that a full readout of the car’s diagnostic data is economically out of reach because it requires expensive software or hardware.

    Perhaps this isn’t a case for the Librarian of Congress in their role in granting copyright exemptions, but under current circumstances it’s basically in the exclusive power of the manufacturers. Barring of course ever-possible reverse engineering. And of course the automakers don’t care about keeping their wares out of the hands of modders; a licensed state inspection station can buy the diagnostic gear but so can a rich kid who wants to change his Honda’s fuel-air mix for his next illegal street race.
