A Tesla blog post describes the first fatality involving a self-driving system. A Tesla was driving on Autopilot down a divided highway when, for reasons still unknown, a truck crossed the highway (something may have been wrong for that to happen).
A white truck body against a bright sky is not something the camera system in the Tesla perceives well, and a truck crossing perpendicular to you on the highway is also an unusual situation.
The truck body was also high off the ground, so when the Tesla did not stop, it went “under” it, and the windshield was the first part of the Tesla to hit the truck body, with fatal consequences for the “driver.” Tesla points out that the Autopilot system has driven 130 million miles, while human drivers in the USA have a fatality about every 94 million miles (though the interval is longer on highways).
Video: Aftermath of Deadly Tesla Crash
Tesla Autopilot is a “supervised” system: the driver must agree that they are monitoring the system and will take control in the event of any problem. This driver, Joshua Brown, did not hit the brakes. As such, the fault for this accident will presumably rest with Brown, or with whatever caused the truck to be crossing a highway lane in this fashion.
Tesla also notes that had the front of the car hit the truck, the crumple zones and other safety systems would probably have saved the driver – hitting a high target is the worst-case situation.
It is worth noting that Brown was a major Tesla fan; in fact, he is the person in a video (watch below) that circulated in April, claiming that Autopilot saved him from a collision with a smaller truck that cut into his lane.
Any commentary here is preliminary until more facts are established, but here are my initial impressions:
- There has been much speculation about whether Tesla took too much risk by releasing Autopilot so early, and that debate will only intensify after this accident.
- In particular, a core issue is that Autopilot works too well, and I have seen many reports of Tesla drivers trusting it far more than they should. Autopilot is fine if used as Tesla directs, but the better it gets, the more it encourages people to over-trust it.
- Both Tesla stock and MobilEye stock were up today, with a bit of a downturn after-hours. The market may not have absorbed this news yet. MobilEye supplies the vision sensor the Tesla uses to power Autopilot, and the failure to detect the truck in this situation is a not-unexpected result for that sensor.
- For years I have heard it said that “the first fatality with this technology will end it all, or set the industry back many years.” My view is that this will not happen.
- One report suggests the truck was making a left turn, which is a more expected situation, though a truck that turns across oncoming traffic would be at fault.
- Another report suggests “friends” claim that the driver often used his laptop while driving, and the truck driver says he heard the car playing a Harry Potter movie after it crashed.
Tesla’s claim of 130M miles is a bit misleading, because most of those miles were actually supervised by humans. That is like reporting the record of student drivers who always have a driving instructor ready to take over. And indeed there are reports of many, many people taking over for the Tesla Autopilot, as Tesla says they should. So at best Tesla can claim that the supervised Autopilot has a record similar to that of human drivers, i.e., it is no better than the humans on their own. Though one incident does not a driving record make.
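To put a number on that last point, here is a quick statistical sketch (my own illustration, using only the 130M-mile figure from the statement; Python with scipy is assumed) of how little a single fatality can tell us:

```python
# A minimal sketch: how wide is the confidence interval that a
# single fatality in 130 million miles actually supports?
from scipy.stats import chi2

miles = 130e6     # Tesla's reported Autopilot miles
k = 1             # observed fatalities
alpha = 0.05      # for a 95% confidence interval

# Exact (Garwood) Poisson confidence interval on the event count.
lower = chi2.ppf(alpha / 2, 2 * k) / 2            # ~0.025 events
upper = chi2.ppf(1 - alpha / 2, 2 * (k + 1)) / 2  # ~5.57 events

print(f"Observed: one fatality per {miles / k / 1e6:.0f}M miles")
print(f"95% CI:   one per {miles / upper / 1e6:.0f}M "
      f"to one per {miles / lower / 1e6:.0f}M miles")
# Output: the true rate is consistent with anything from about one
# fatality per 23M miles to one per 5,100M miles. The human baseline
# of one per 94M miles sits well inside that range, so this single
# data point says almost nothing about relative safety.
```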
Joshua Brown, the first person to die in a self-driving car accident, was a major Tesla fan. (Photo: Facebook)
Camera vs. LIDAR, and Maps
I have often written about the big question of cameras vs. LIDAR. Elon Musk is famously on record as being against LIDAR, when almost all robocar projects in the world rely on it. Current LIDARs are too expensive for production automobiles, but many companies, including Quanergy (where I am an advisor), are promising very low-cost LIDARs for future generations of vehicles.
Here there is a clear situation where LIDAR would have detected the truck. A white truck against the sky would be no issue at all for a LIDAR; it would see it very well. In fact, a big white target like that would be detected beyond the normal range of a typical LIDAR. That range is an issue here – most LIDARs would only detect other cars about 100m out, but a big white truck would be detected a fair bit further. Even so, that is not quite far enough to stop in time for an obstacle like this at highway speeds; however, the car could have braked to make the impact vastly less severe, and a clever car might even have had time to swerve, or to deliberately aim for the wheels of the truck rather than slide underneath the body.
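Here is the back-of-envelope arithmetic behind that claim. The speed, brake latency, and deceleration below are my assumptions for illustration, not figures from the accident:

```python
# Stopping-distance sketch vs. a ~100 m LIDAR detection range.
# All numbers below are assumptions for illustration.
MPH_TO_MS = 0.44704
G = 9.81

speed = 80 * MPH_TO_MS       # ~35.8 m/s (assumed highway speed)
latency = 0.5                # s of perception + actuation delay (assumed)
decel = 0.7 * G              # ~6.9 m/s^2, hard braking on dry road (assumed)
detection_range = 100.0      # m, typical LIDAR range for car-sized targets

stop_dist = speed * latency + speed**2 / (2 * decel)
print(f"Full stop needs {stop_dist:.0f} m; only {detection_range:.0f} m available")

# Can't quite stop in time, but braking still sheds most of the energy.
brake_dist = detection_range - speed * latency
impact_speed = max(0.0, speed**2 - 2 * decel * brake_dist) ** 0.5
print(f"Impact speed after braking: {impact_speed / MPH_TO_MS:.0f} mph")
# -> roughly 111 m needed vs. 100 m available; impact at ~27 mph
# instead of 80 mph, which carries only about 12% of the original
# kinetic energy (energy scales with speed squared).
```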
Another sensor that is problematic here is radar. Radar would have seen this truck without a problem, but since it was perpendicular to the car’s travel, it was not moving away from or toward the car, and thus had the Doppler speed signature of a stopped object. Radar is great because it tracks the speed of obstacles, but because there are so many stationary objects, most radars have to simply disregard such returns – they can’t tell a stalled vehicle from a sign, bridge or berm.
A map of where all the fixed radar reflectors are located can help with that. If you get a sudden bright radar return from a truck or car somewhere the map says no big object is present, that’s an immediate sign of trouble. (At the same time, it means you don’t easily detect a stalled vehicle next to a bridge or sign.)
Another solution is longer-range LIDAR or higher-resolution radar. Google has said it has developed longer-range LIDAR. In this case, even regular-range LIDAR, or radar combined with a good map, would likely have noticed the truck.
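To make the Doppler problem and the map cross-check concrete, here is a toy sketch of the filtering logic (my own illustration, not any production radar stack; the grid map and speed threshold are invented):

```python
import math

def radial_speed(target_speed: float, angle_deg: float) -> float:
    """Doppler radar measures only the velocity component along the
    beam: v_radial = v * cos(angle). A perpendicular crosser reads ~0."""
    return target_speed * math.cos(math.radians(angle_deg))

# A truck crossing our path at 10 m/s, 90 degrees to the radar beam:
print(f"{radial_speed(10.0, 90.0):.2f} m/s")  # ~0.00 -> looks stationary

# Hypothetical prior map: grid cells known to hold fixed reflectors
# (bridges, signs, berms).
STATIC_REFLECTORS = {(120, 0)}

def keep_return(cell, v_radial, threshold=0.5):
    """Movers are always tracked. Doppler-stationary returns are
    normally discarded, unless the map has no fixed reflector there,
    in which case they may be a stalled or crossing vehicle."""
    if abs(v_radial) > threshold:
        return True                        # moving toward/away: track it
    return cell not in STATIC_REFLECTORS   # stationary AND unmapped: trouble

print(keep_return((80, 0), 0.0))    # True  -> bright return, no mapped object
print(keep_return((120, 0), 0.0))   # False -> discarded as a mapped sign/bridge,
                                    # the blind spot noted in the text above
```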
A Tesla Model S, driving in Autopilot mode. (Photo: Jasper Juinen/Bloomberg)
Tesla Statement on the Accident
Here’s Tesla’s full statement on the fatal accident:
“We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.
“Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.
“It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
“We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.
“The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.”
NHTSA Statement
The NHTSA is investigating the accident and issued the following statement to The Verge:
“NHTSA’s Office of Defects Investigation is opening a Preliminary Evaluation of the design and performance of automated driving systems in the Tesla Model S.
“NHTSA recently learned of a fatal highway crash involving a 2015 Tesla Model S, which, according to the manufacturer, was operating with the vehicle’s ‘Autopilot’ automated driving systems activated. The incident, which occurred on May 7 in Williston, Florida, was reported to NHTSA by Tesla. NHTSA deployed its Special Crash Investigations Team to investigate the vehicle and crash scene, and is in communication with the Florida Highway Patrol. Preliminary reports indicate the vehicle crash occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway. The driver of the Tesla died due to injuries sustained in the crash.
“NHTSA’s Office of Defects Investigation will examine the design and performance of the automated driving systems in use at the time of the crash. During the Preliminary Evaluation, NHTSA will gather additional data regarding this incident and other information regarding the automated driving systems.
“The opening of the Preliminary Evaluation should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles.”
This article was reprinted with permission from Brad Templeton’s Robocars Blog.