That's comparable to the autonomous vehicle accident reports Google files with the California DMV. A minor rear-ending at low speed is the most common problem. The usual situation is that the Waymo vehicle has an obstructed line of sight at an intersection and enters it very slowly. Then the system sees cross traffic and stops. The human-driven vehicle following the Waymo vehicle then fails to stop in time. There's one intersection in Mountain View with a tree in the median high enough to block the LIDAR but clear at window height. That causes a very cautious intersection entry. Waymo vehicles have been rear-ended twice there. As LIDAR units get cheaper and more are fitted, that problem should be fixed.
Autonomous vehicles may need a "Back Off" signal, like flashing the brake lights at ultra-bright levels when someone gets too close.
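A rough sketch of how such a "Back Off" trigger might work, assuming a hypothetical rear-facing radar that reports gap distance and closing speed. Everything here (the names, the thresholds, the lamp-control call) is illustrative, not anything Waymo has published:

    # Hypothetical "Back Off" signal trigger: flash the brake lights
    # at high intensity when a following vehicle gets too close or
    # closes too fast. All names and thresholds are assumptions.
    from dataclasses import dataclass

    @dataclass
    class RearRadarReading:
        gap_m: float        # distance to the following vehicle, meters
        closing_mps: float  # positive when the follower is gaining on us

    def should_flash(reading: RearRadarReading,
                     min_gap_m: float = 4.0,
                     min_ttc_s: float = 1.5) -> bool:
        """True when the follower is dangerously close or closing fast."""
        if reading.gap_m < min_gap_m:
            return True
        if reading.closing_mps > 0:
            time_to_collision = reading.gap_m / reading.closing_mps
            return time_to_collision < min_ttc_s
        return False

    def flash_brake_lights() -> None:
        # Placeholder for the actual lamp controller interface.
        print("BACK OFF: pulsing brake lights at high intensity")

    if __name__ == "__main__":
        tailgater = RearRadarReading(gap_m=6.0, closing_mps=5.0)  # TTC = 1.2 s
        if should_flash(tailgater):
            flash_brake_lights()

The time-to-collision check matters more than raw distance: a car six meters back but closing at 5 m/s is a bigger threat than one sitting four meters back at the same speed.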
"The one actual, non-simulated angled collision occurred when a vehicle ran a red light at 36 mph, smashing into the side of a Waymo vehicle that was traveling through the intersection at 38 mph."
Not Waymo's fault. I wonder if Waymo went to court, with full video of the driver running the red light.
This is encouraging.