Most of you will have heard about the giant scandal in which Volkswagen (VW) put software in its cars to deliberately cheat on emissions tests in the USA and possibly other places. It’s very bad for VW, but what does it mean for all robocar efforts?
You can read tons about the Volkswagen emissions violations, but here’s a short summary. All modern cars have computer-controlled fuel and combustion systems, and these can be tuned for different levels of performance, fuel economy and emissions. (Of course, ignition in a diesel is not done by an electronic spark.) Cars have to pass emissions tests, so most cars have to tune their systems in ways that sacrifice other things (like engine performance and fuel economy) in order to reduce their pollution. Most cars also attempt to detect the style of driving going on, and tune the engine differently for the best results in that situation.
VW went far beyond that. Apparently its system was designed to detect when it was in an emissions test. In these tests, the car is on rollers in a garage, and it follows certain patterns. VW set its diesel cars to look for this, and to tune the engine to produce emissions below the permitted numbers. When the car saw it was in more regular driving situations, it switched the tuning to modes that gave better performance and better mileage but, in some cases, vastly worse pollution. A commonly reported number is that in some modes the cars emitted 40 times the California limit for nitrogen oxides (NOx), and even over a wide range of driving the figure was as high as 20 times the California limit (about 5 times the European limit). NOx are a major smog component and bad for your lungs.
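To make the mechanism concrete, here is a minimal sketch (in Python rather than real ECU firmware) of how an engine controller might select a tuning map, including a cheat that watches for dyno-test conditions. Every name and threshold here is hypothetical; the actual detection heuristics reportedly involved things like steering input, wheel speeds and test-cycle timing.

```python
# Hypothetical illustration of a "defeat device" -- not VW's actual code.
# A real ECU runs embedded firmware; all names and thresholds are invented.

from dataclasses import dataclass

@dataclass
class VehicleState:
    wheel_speed_kph: float      # on dyno rollers, the driven wheels spin
    steering_angle_deg: float   # but the steering wheel barely moves
    elapsed_minutes: float      # standard test cycles have fixed durations

def looks_like_emissions_test(state: VehicleState) -> bool:
    """Guess whether the car is on a test dynamometer.

    On rollers, the wheels turn while the steering stays essentially
    centered -- a pattern that is rare in real-world driving.
    """
    return (state.wheel_speed_kph > 10.0
            and abs(state.steering_angle_deg) < 2.0
            and state.elapsed_minutes < 31.0)   # e.g. the ~31-minute FTP-75 cycle

def select_tuning_map(state: VehicleState) -> str:
    if looks_like_emissions_test(state):
        return "low_NOx_map"     # clean, at a cost in mileage and performance
    return "performance_map"     # better power and mileage, far higher NOx
```

The legitimate version of this logic, which all carmakers use, has the same structure with benign inputs: detect city versus highway driving, or a cold engine, and pick a map accordingly. The fraud lies in making “being tested” one of the detected conditions.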
It has not been revealed just who at VW did this, and whether other car companies have done this as well. (All companies do variable tuning, and it’s “normal” to have modestly higher emissions in real driving compared to the test, but this was beyond the pale.) The question everybody is asking is “What the hell were they thinking?”
Beyond that, VW has seriously reduced the trust that customers and governments will place not just in VW itself, but in carmakers in general, and in their software offerings in particular. The lost trust will spread to all German carmakers and possibly to all carmakers, and could mean reduced trust in the software in robocars.
Is There a Connection to Self-Driving Cars?
It’s not too surprising that companies might cheat to improve the bottom line, especially when they convince themselves they won’t get caught. Where does that leave the would-be untrustworthy robocar maker?
My prediction is that robocar vendors will end up self-insuring their vehicle fleets, at least while the software is driving. Conventional insurance in Pay-As-You-Drive (PAYD) mode may apply to miles driven with a human at the wheel. The vendors or fleet operators may purchase reinsurance to cover major liabilities, but will do so under a very specific contract with the underwriter, one which won’t protect them in the event of actual fraud.
If they self-insure, they have zero interest in cheating on safety. If they don’t make a car safe enough, they will be responsible for the cost of every accident. There will be nobody to cheat but themselves, though the pain of injuries, which goes beyond what a court awards, still needs to be considered. One reason for self-insurance, in fact, is that you will actually feel safer getting into a car knowing it is the vendor who stands to lose if it is not safe enough.
Of course, in the event of accidents, vendors will work as hard as possible to avoid liability, but this comes at a cost of its own.
Cheats are far more likely if they benefit customers and increase sales. Examples might include ways that cars break traffic laws in order to get you to places faster. Cars might park (actually “stand”) where they should not. Already there are cars with a dial that lets the occupant/controller set the cruising speed above the speed limit, and in fact these dials are necessary. There has been lots of recent discussion of other ways it is necessary to not strictly observe the law in order to drive well on US roads.
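To show where a legitimate feature shades into a potential cheat, here is a minimal sketch of such a speed dial, with invented names and limits. The design question is who picks the cap: a conservative, documented clamp is a defensible engineering tradeoff, while a hidden or unbounded one starts to look like VW-style behavior.

```python
# Hypothetical sketch of a bounded "speed dial" -- names and limits invented.

MAX_OFFSET_KPH = 10.0   # a hard cap the vendor is willing to defend publicly

def target_speed_kph(posted_limit_kph: float, requested_offset_kph: float) -> float:
    """Compute cruising speed from the posted limit plus the occupant's dial.

    The offset is clamped to a documented cap, so the occupant can match
    prevailing traffic flow but cannot command arbitrary speeding.
    """
    offset = max(0.0, min(requested_offset_kph, MAX_OFFSET_KPH))
    return posted_limit_kph + offset
```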
One can imagine a number of other tricks that are not specific to robocars. Cars might try to cheat you on the bill for a taxi ride (just as cab drivers are known to sometimes deliberately take bad routes to run up the fare).
VW/Audi have had some decent robocar projects, and VW’s sponsorship of Stanford’s VAIL lab has funded a lot of that work. Now we must downgrade VW as a vendor that customers will trust. (There is some irony to that, of course, since at this point VW is probably the company least likely to cheat going forward.)
Would Suppliers Lie?
There may be more risk of suppliers of robocar technology, such as sensors, being untruthful about their capabilities or, more likely, their reliability. The integrators will be inherently distrustful, since they will take on the liability, but one can see smaller vendors telling lies if they see it as the only way to land a big sale their business needs. While they would end up liable if caught, they might not have the resources to pay for that liability, and might be more interested in making the big time in the hope of not being caught.
This risk is why the car industry tends to buy only from huge suppliers known as “tier 1” companies. The smaller suppliers, in tiers 2 and 3, effectively can’t sell to the big auto OEMs, because the big auto companies won’t bet a car line on a small company. Instead, the small companies have to partner with a tier 1 that takes on that responsibility, along with, of course, a chunk of the profits.
On the plus side, robocar designs generally expect parts and sensors to fail from time to time, so a good design plans for failure and survives it safely, with the ability at the very least to pull safely off the road; another car will be on the way quickly. Most designs do not plan, however, for a sensor that deliberately provides false information, other than in planning defenses against computer intrusion, which could turn a component into a malicious and untrustworthy device.
But people are thinking about this, which can give us some comfort on the question of fraud by a supplier.
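One widely discussed mitigation, sketched below with invented names and tolerances, is to treat no single sensor as authoritative: cross-check redundant sensors against one another and fall back to a safe stop when they disagree. This is a simplification; real systems fuse many heterogeneous sensors with probabilistic models, but the key property is the same: a single failed or lying component gets outvoted, or at worst forces a safe pull-over.

```python
# Hypothetical sketch of cross-checking redundant range sensors.
# The median vote and every tolerance here are invented for illustration.

from statistics import median

DISAGREEMENT_TOLERANCE_M = 0.5   # how far a reading may stray from the consensus

def fuse_range_readings(readings_m: list[float]) -> float | None:
    """Return a consensus range in meters, or None to trigger a safe stop.

    A sensor that fails -- or lies -- shows up as an outlier against the
    median of its redundant peers and is simply outvoted; if too few
    sensors agree, the car should not trust the measurement at all.
    """
    if len(readings_m) < 3:          # redundancy is required before any vote
        return None
    consensus = median(readings_m)
    agreeing = [r for r in readings_m
                if abs(r - consensus) <= DISAGREEMENT_TOLERANCE_M]
    if len(agreeing) < 2:            # not enough mutually consistent sensors
        return None                  # caller initiates a safe pull-over
    return sum(agreeing) / len(agreeing)
```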
Self-Certification
This scandal will probably raise more questions about the popular (and still probably correct) approach of having vendors self-certify that they have attained functional safety goals for their systems. These are actually unrelated issues: VW was not self-certifying; it was going through a government certification process and cheating on it. If anything, the scandal actually reduces trust in government certification approaches. However, in the public eye, the reduced trust in vendors will extend to everything they do, including self-certification.
The reasoning above about motives also applies to self-certification. Vendors (at least those of size and reputation) have strong motives not to lie in self-certification, both because they are liable for the safety failures that are their fault, and because they will face extra liability, with possible punitive damages, if they deliberately lied.
I have a longer article with more debate on the issues around government regulation and certification of robocars.
Editor’s Note: This article first appeared on Brad Templeton’s Robocars Blog.