Reports released reveal that one of Google's Gen-2 vehicles (the Lexus) had a fender-bender with a bus, with some responsibility assigned to the system. This is the first crash of this type – all other impacts have been reported as fairly clearly the fault of the other driver.
This crash ties into an upcoming article I will be writing about driving in places where everybody violates the rules. I just got back from a trip to India, which has one of the strongest examples of this sort of road system, far more chaotic than California, and it got me thinking a bit more about the problems.
Google is thinking about them, too. Google reports it recently started experimenting with new behaviors, in this case when making a right turn on a red light off a major street where the right lane is extra wide. In that situation it has become common behavior for cars to effectively create two lanes out of one, with a straight-through group on the left and right-turners hugging the curb.
The vehicle code would have there be only one lane, and the first person not turning would block everybody turning right, much to their annoyance. (In India, the lane markers are barely suggestions, and drivers – piloting every width of vehicle you can imagine – dynamically form their own patterns as needed.)
Google’s Take on the Accident
Our self-driving cars spend a lot of time on El Camino Real, a wide boulevard of three lanes in each direction that runs through Google’s hometown of Mountain View and up the peninsula along San Francisco Bay. With hundreds of sets of traffic lights and hundreds more intersections, this busy and historic artery has helped us learn a lot over the years. And on Valentine’s Day we ran into a tricky set of circumstances on El Camino that’s helped us improve an important skill for navigating similar roads.
El Camino has quite a few right-hand lanes wide enough to allow two lines of traffic. Most of the time it makes sense to drive in the middle of a lane. But when you’re teeing up a right-hand turn in a lane wide enough to handle two streams of traffic, annoyed traffic stacks up behind you. So several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do: hug the rightmost side of the lane. This is the social norm because a turning vehicle often has to pause and wait for pedestrians; hugging the curb allows other drivers to continue on their way by passing on the left. It’s vital for us to develop advanced skills that respect not just the letter of the traffic code but the spirit of the road.
On February 14, our vehicle was driving autonomously and had pulled toward the right-hand curb to prepare for a right turn. It then detected sandbags near a storm drain blocking its path, so it needed to come to a stop. After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the center of the lane at around 2 mph — and made contact with the side of a passing bus traveling at 15 mph. Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it. (You can read the details below in the report we submitted to the CA DMV.)
Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.
This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.
We’ve now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.
As such, Google wanted their car to be a good citizen and hug the right curb when doing a right turn. So they did, but found the way blocked by sandbags on a storm drain, and had to “merge” back with the traffic in the left side of the lane. They did this while a bus was coming up on the left, making the assumption, as many would, that the bus would yield and slow a bit to let them in. The bus did not, and the Google car hit it, but at very low speed.
The Google car could probably have solved this with faster reflexes and a better read of the bus's intent, and probably will in time, but more interesting is the question of what you expect of other drivers. The law doesn't imagine this split lane or this “merge,” and of course the law doesn't require people to slow down to let you in.
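Google has not published how its planner actually models yielding behavior, but the kind of reasoning it describes – weighting the chance that a particular vehicle will let you in before committing to the move – can be sketched in a few lines. The Python below is purely illustrative: the vehicle classes, probability numbers, closing-speed penalty and threshold are all my own assumptions, not anything from Google's system.

```python
# Hypothetical sketch of a yield-prediction check before re-entering the lane.
# All class priors, the speed penalty and the threshold are made-up values
# for illustration; this is not Google's model.

YIELD_PRIOR = {
    "car": 0.80,    # a typical passenger car will often slow to let you in
    "bus": 0.55,    # large vehicles brake slowly and yield less often
    "truck": 0.55,
}

def predict_yield(vehicle_class, closing_speed_mph):
    """Crude estimate of the chance the approaching vehicle yields: start
    from a per-class prior, and cut it in half if the vehicle is closing
    fast, i.e. shows no sign of braking."""
    p = YIELD_PRIOR.get(vehicle_class, 0.6)
    if closing_speed_mph > 10:
        p *= 0.5
    return p

def should_merge(vehicle_class, closing_speed_mph, threshold=0.7):
    """Only angle back toward the center of the lane if we are fairly
    confident the approaching vehicle will let us in."""
    return predict_yield(vehicle_class, closing_speed_mph) >= threshold

# A slow-approaching car clears the bar; a bus in the same position does not.
print(should_merge("car", closing_speed_mph=5))   # True
print(should_merge("bus", closing_speed_mph=5))   # False
```

The only point of the sketch is that once the prior for large vehicles is lowered – which is essentially the refinement Google says it made – the same situation produces a “wait” decision instead of a “go.”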
But driving in so many cities requires constantly expecting the other guy to slow down and let you in. (In places like Indonesia, the rules actually give the right-of-way to the guy who cuts you off, because you can see him and he can’t easily see you, so it’s your job to slow. Of course, robocars see in 360 degrees, so no car has a better view of the situation.)
While some people like to imagine that the important ethical questions for robocars revolve around choosing who to kill in an accident, that's actually an extremely rare event. The real ethical issues revolve around how to drive when driving involves routinely breaking the law – not once in 100 lifetimes, but once every minute. Or once every second, as is the case in India. To solve this problem, we must come up with a resolution, and we must eventually get the law to accept it the same way it accepts it for all the humans out there, who are almost never ticketed for these infractions.
So why is this a good thing? Because Google is starting to work on problems like these, and you need to solve these problems to drive even in orderly places like California. And yes, you are going to have some mistakes, and some dings, on the way there, and that's a good thing, not a bad thing. Mistakes in negotiating who yields to whom are very unlikely to involve injury, as long as you don't involve things smaller than cars (such as pedestrians). Robocars will need to not always yield in a game of chicken, or they can't survive on the roads.
In this case, Google says it learned that big vehicles are much less likely to yield. In addition, it sounds like the vehicle’s confusion over the sandbags probably made the bus driver decide the vehicle was stuck. It’s still unclear to me why the car wasn’t able to abort its merge when it saw the bus was not going to yield, since the description has the car sideswiping the bus, not the other way around.
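Purely as an illustration of the kind of check I have in mind (the loop and tick data below are hypothetical, and reuse predict_yield from the earlier sketch), an abort would amount to re-evaluating the yield prediction on every planning cycle while the maneuver is in progress, and backing out as soon as it stops holding:

```python
# Hypothetical abort-merge check; predict_yield is the function from the
# earlier sketch. It replays a short, made-up sequence of observations of
# the bus and backs out as soon as the yield prediction collapses.

ABORT_THRESHOLD = 0.5

# (vehicle_class, closing_speed_mph) observed on successive planning ticks
observations = [("bus", 6.0), ("bus", 8.0), ("bus", 13.0)]

def try_merge(observations):
    for tick, (cls, closing) in enumerate(observations):
        if predict_yield(cls, closing) < ABORT_THRESHOLD:
            print(f"tick {tick}: bus still closing at {closing} mph, abort the merge")
            return False
        print(f"tick {tick}: prediction holds, keep angling out")
    return True

try_merge(observations)   # aborts on the third tick under these assumptions
```

Whether the real planner lacked such a check, or simply ran out of time at 2 mph versus 15 mph, the report doesn't say.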
Nobody wants accidents – and some will play this accident as more than it is – but neither do we want so much caution that we never learn these lessons.
It’s also a good reminder that even Google, though it is the clear leader in the space, still has lots of work to do. A lot of people I talk to imagine that the tech problems have all been solved and all that’s left is getting legal and public acceptance.
There is great progress being made, but nobody should expect these cars to be perfect today. That’s why they run with safety drivers, and did even before the law demanded it. This time the safety driver also decided the bus would yield and so let the car try its merge. But expect more of this as time goes forward.