How do good people go so wrong? What can we do about it? Lessons from the Volkswagen fraud

Dozens, if not hundreds, of Volkswagen employees must have known of the recently detected emissions-software fraud that continued for six years and affected 11 million cars around the world.

And yet the fraud was not revealed by a whistle-blower; it was discovered by outsiders.  How could Volkswagen management have coerced so many otherwise reasonable, ethical people into silent cooperation over so long a time?  Or did they deliberately hire only crooks?

I’m writing about the Volkswagen fraud in this blog because it raises two big questions relevant to elections: First, is it plausible that voting machine companies could perpetrate a similar sort of fraud?  Second, is it plausible that their customers, local election officials, wouldn’t notice?

Now, I don’t run around claiming fraud I cannot prove. I have a natural-born skeptic’s aversion to making accusations without conclusive evidence. Years working at Wisconsin’s Legislative Audit Bureau—an investigative and oversight agency—only sharpened that instinct. At the same time, my hard-wired respect for hard facts also makes me dubious about election officials’ claims that they can ensure 100% correct results every time without routine verification. No other computer-dependent manager makes such an extraordinary demand on my trust. I am acutely aware that when it comes to electronic elections fraud, arguably the only elections we can be sure were not miscounted are the very few that were recounted. Considering what is at stake, that is a very serious problem.

Back to the questions. First, with regard to the voting-machine companies, the Volkswagen affair clearly demonstrates the plausibility of systemic fraud by companies that manufacture, sell, or service computerized equipment, whether it’s cars or voting machines. Fraud is particularly feasible when they demand proprietary secrecy for their software, and when they sell their product to consumers who have no particular information technology expertise--like local government election officials.  

For example, a voting-machine company could routinely install clandestine wireless communications capability on each of its voting machines, in case the company ever feels the need to fix a bug (or an election) without the knowledge of local election officials. The vendors know that local officials never inspect the equipment in any way that might discover such a chip, because they don’t want to appear to be ‘tampering with the machines.’ (See footnote.) And IT experts tell us that is only one of several ways voting-machine company insiders could alter election results if they chose to.

With regard to the second question—local election officials’ willingness to allow possible fraud to go undetected—that’s readily answered. We know they are willing. They grant vendors proprietary secrecy for their software, so no election official ever inspects the vote-tabulating software loaded into any machine. And yet, knowing they did not and cannot inspect the software, the vast majority of jurisdictions—including all Wisconsin counties—routinely neglect to verify the accuracy of electronically tabulated results. My previous post in this blog described the puzzling refusal of many officials even to learn about their practical options for detecting miscounts.

But why?

In a well-researched and well-written article in the most recent issue of The Atlantic, “What Was Volkswagen Thinking?,” business journalist Jerry Useem provides insight into both the corporations and their customers.

Using examples of actual corporate conduct both good and bad, the article explains how otherwise ethical employees of a company, such as a voting-machine company, could go for years without saying anything about their machines’ hackability, and why otherwise responsible election officials could choose to trust the machines’ Election-Day accuracy based on nothing more than a determined will to believe.

People within an organization, Useem explains, fall prey to “a cultural drift in which circumstances classified as not okay are slowly reclassified as okay.” In response to pressures and expectations in their workplace, otherwise competent people come to see careless conduct as admirable flexibility. They begin to tolerate excessive risk-taking as if it were creative experimentation. Withholding damaging information starts to feel like prudent discretion. Actual fraud is mentally redefined as practical adaptation to circumstances.

The derailment usually starts at the top. Useem notes the obvious: in many cases, management is simply dishonest, as seems to have been the case with Volkswagen, and as many election-integrity activists fear when they point to voting-machine companies owned by active partisans.

But management might be honest and just as genuinely self-deluded as its employees. Upper-level management might have made ‘fantastic commitments’ without being aware that the company’s employees could not fulfill them without cheating. From my own experience with contracting and procurement, I also suspect the impetus to cheat can arise from customers’ demands.

For example, one Wisconsin county clerk has written that he does not allow updates or patches to that county’s voting-machine software. If this is true, it’s easy to see how the vendor would be tempted to install wireless communications capability in that county’s voting machines. And why not? The company occasionally needs to update and patch the software, and the county clerk will never discover the communications chip. It would be the only way the company could keep the machines up to date without upsetting the clerk. And once that chip is there, every employee who knows about it is bound by nothing stronger than an honor system not to alter election results, because output isn’t routinely verified in Wisconsin.

Once expectations are clear, employees will try to conform their behavior to them. And that behavior is comfortable only when they also bring their perceptions and beliefs into compliance.

For example, an in-depth study following the Challenger space-shuttle explosion found that engineers had observed O-ring failure--the cause of the disaster--in many previous tests; had reported it; and had repeatedly been pressured to endorse more risk as acceptable. Compliantly, they had rewritten their reports to approve looser and looser standards. Then, just before the fatal flight, freezing temperatures prompted the engineers to issue a “no-launch” recommendation. However:

“The data they faxed to NASA to buttress [their no-launch recommendation] were the same data they had earlier used to argue that the space shuttle was safe to fly. NASA pounced on the inconsistency. [Faced with the] script they themselves had built in the preceding years, [the engineers] buckled. The ‘no-launch’ recommendation was reversed to ‘launch.’”

Useem explains that the engineers and managers “were not merely acting as if nothing was wrong. They believed it, bringing to mind Orwell’s concept of doublethink, the method by which a bureaucracy conceals evil not only from the public but from itself.”

That could explain why so many otherwise responsible election officials seem to shut down when you try to talk to them about examining voting-machine output for accuracy. We look at them and think, “You’ve got so very much invested in the elections—all those security measures, all those pre-election tests—how could you not want to check to make sure everything came out right on Election Day?”

But they look at all their hard work and think, “I promised everyone it would work. My career will be over if it doesn’t work. It’s got to work. I’m sure it worked.” Useem explains what happens next:

“Even without stress, people tend to underestimate the probability of future bad events. Put them under emotional stress ... and this tendency gets amplified. People will favor decisions that preempt short-term social discomfort even at the cost of heightened long-term risk.”

So how do we help election officials break out of this mindset? How do we increase their fear of a ‘future bad event’ that could cause them ‘short-term social discomfort’?

Oddly enough, it is not the risk of a miscounted election occurring that causes their fear; it is the risk of a miscount being detected. Most computer-dependent managers--unlike election officials--realistically expect that if they do not catch a computer error, someone else will. For them, routine audits reduce fear. But voting machines that miscount don’t blow up like space shuttles, and candidates cannot take voting machines out for test drives between elections. Therefore, in the absence of post-election verification, election officials have no realistic fear of anyone discovering a miscount. The idea of an audit has the opposite effect on them: their fear goes from nothing to, well, something.

That’s why transparent, high-quality citizens’ audits are critical. Even if they cannot be completed before the identified winners are sworn into office, they will still serve the useful purpose of increasing election officials’ realistic expectation (okay, fear) that any miscounts will be detected—if not by them, then by someone else. That will create for them the same discomfort felt by bankers, city treasurers, grocery store owners, and all the other computer-dependent managers who fear embarrassment or worse if they don’t notice a computer error and correct it before someone else does.

---

Note: I know of no requirement that anyone inspect Wisconsin’s voting machines to verify that they have no wireless communications capability. Over the years, I’ve asked several Wisconsin election officials whether they inspect the machines anyway; none has told me that they do. If anyone reading this has first-hand knowledge of any local officials’ inspection practices that include checking for wireless communications capability, please email me at WiscElectionIntegrity@gmail.com.
