Comma.ai, which previously gave up on its Comma One self-driving car system after threats from NHTSA, has changed gears. It has open sourced new software called Open Pilot, along with hardware and case designs for the Comma Neo.
The code for Open Pilot, which Comma.ai is calling an “open source alternative to [Tesla’s] Autopilot,” is available on GitHub. NHTSA did not want Comma.ai making an autopilot, and said the company could not simply rely on the fact that drivers were told they must be diligent.
It will be very interesting to see how NHTSA reacts to the release of open designs that anybody can then install on their car.
The automotive industry has a long history of valuing the tinkerer. All the big car companies had their beginnings with small tinkerers and inventors. Some even died in the very machines they were inventing. These beginnings have allowed people to do all sorts of playing around in their garages with new car ideas, without government oversight, in spite of the risk to themselves and even others on the road. If a mechanic wants to charge you for working on your car, they must be licensed, but you are free to work on it yourself with no license, and even to build experimental cars. You just can’t sell them. And even those rights have been eroded.
Clearly, far fewer people will have the inclination to build an autopilot using Comma.ai’s tools by themselves. But it won’t be that hard to do, and it will become even easier with time. One could even imagine a car which already had the necessary hardware, so that you only needed to download software to make it happen.
In recent times, there has been a strong effort to prevent people from tinkering with their cars, even in software. One common area of controversy has been engine tuning, which is regulated by the EPA to keep emissions low. Car vendors have to show they have done this – and they can’t program their cars to give good emissions only on the test while getting better performance off the test, as VW did. But owners have been known to want to make such modifications. Now we will see modifications that affect not just emissions but safety.
Car companies don’t want to be responsible if you modify the code in your car and there is an accident involving both their code and yours. As such, they will try to secure their car systems so you can’t change them, and the government may help them or even insist on it. When you add computer security risks to the mix – who can certify the modified car can’t be taken over and used as a weapon? – it will get more fun.
I suspect Comma.ai’s approach would not know what to do about the collapsed road, because it would never have been trained in that situation. It might, however, simply sound an alert and kick out, not being able to find the lane any more.
Editor’s Note: This article was republished from Brad Templeton’s Robocars blog.