
I don't believe I'm delusional; my code is certainly medium-to-okay. I've put a lot of thought into this. I think autonomous cars are a very good idea, I want to work on building them, and I want to own one as soon as safely possible.

Autonomous cars, as they exist right now, are not up to the task at hand.

That's why they should still have safety drivers and other safeguards in place. I don't know enough to understand their reasoning, but I was very surprised when Waymo removed safety drivers in some cases. This accident is doubly surprising, since there WAS a safety driver in the car. I'll be interested to see the analysis of what happened and what failures occurred to let this happen.

Saying that future accidents will be "unexpected" and therefore scary is FUD in its purest form: fear based on uncertainty and doubt. It will be very clear exactly what happened and what the failure case was. Even the failure the parent described, "it saw a person with stripes and thought they were road," is incredibly stupid but very simple and explainable. The other failures that had to occur for that one failure to cause a death will also be explainable (and expectable).

What set of systems (multiple cameras, LIDAR, RADAR, accelerometers, maps, GPS, etc.) had to fail, in what combined way, for such a failure? Which one of N different individual fixes could have prevented the entire failure cascade? What change needs to take place to prevent future failures of this sort, even changes as stupid as the failure itself, like "ban striped clothing"? Obviously any changes should happen in the car first, whether software or hardware modifications, or operational changes such as maximum speed, minimum tolerances / safe zones, or even physical reconfiguration of redundant systems. Only after that should we ask whether laws or norms should change, or whether roads should be designed with better markings or wider lanes. Should humans have to press a button to continue driving when stopped at a crosswalk, even if they don't otherwise have to operate the car?
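To make the redundancy point concrete, here is a minimal sketch of a majority-vote cross-check over independent perception channels. Everything here is hypothetical (the names, the three-sensor setup, the fallback states); it is not Waymo's actual stack, just an illustration of why a single fooled sensor shouldn't be enough to cause a collision:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str
    obstacle_ahead: bool

def plan_action(detections: list[Detection]) -> str:
    """Vote across independent sensors; never trust one channel alone."""
    votes = sum(d.obstacle_ahead for d in detections)
    if votes > len(detections) // 2:
        return "brake"            # majority sees an obstacle
    if votes > 0:
        return "slow_and_verify"  # minority report: treat doubt as danger
    return "proceed"              # unanimous all-clear

# The camera is fooled by striped clothing, but LIDAR and RADAR
# still see a person, so the cascade stops at the fusion layer:
readings = [
    Detection("camera", False),  # the single-point failure
    Detection("lidar", True),
    Detection("radar", True),
]
print(plan_action(readings))  # -> brake
```

The design choice worth noting is that disagreement itself is treated as a hazard: a lone dissenting sensor degrades the car to a cautious state instead of being outvoted into silence.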

Lots of people have put a lot of thought into these scenarios. There is even an entire discipline, functional safety, built around exactly these questions and answers. There's no one answer, but autonomy engineers are not unthinking and delusional.



