
> For me it's obvious the real issue is the edge cases - of course a self-driving car does fine in 99%+ of cases, but those small % of outlier cases are the ones where people potentially die. I suspect an entirely different approach is required to supplement/supplant the deep learning one and fully autonomous vehicles are still quite some way off.

Why is it that every time the safety of self-driving cars comes up, people seem to ignore that human drivers are not 100% perfect either? There are on average 104 deaths on US roads each day. If we can get that number down to 50, that's a net positive.

Self-driving cars don't need to be perfect, just better than us.

So the question shouldn't be "are SDCs perfect yet?" but "are they better than us?"
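
To make the arithmetic concrete, here's a back-of-envelope sketch in Python. The 104 deaths/day figure is the one cited above; the 50/day rate for self-driving cars is purely hypothetical:

    # Back-of-envelope comparison of annual US road deaths.
    # 104/day is the figure cited above; 50/day is a hypothetical
    # target for self-driving cars, not a measurement.
    HUMAN_DEATHS_PER_DAY = 104
    SDC_DEATHS_PER_DAY = 50  # hypothetical

    human_annual = HUMAN_DEATHS_PER_DAY * 365  # ~38,000
    sdc_annual = SDC_DEATHS_PER_DAY * 365      # ~18,250

    print(f"Human drivers:     ~{human_annual:,} deaths/year")
    print(f"Hypothetical SDCs: ~{sdc_annual:,} deaths/year")
    print(f"Net lives saved:   ~{human_annual - sdc_annual:,}/year")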



I think you've misinterpreted what I said: it is the edge-case scenarios where fatalities tend to occur. With humans that can be due to drunkenness, distraction, or confusion; nobody is claiming human drivers are perfect. But if it is the 'weird' situations that tend to result in accidents, and deep learning tends to make mistakes in exactly those 'weird' circumstances, that doesn't fill me with confidence.

Also, it is not at all obvious that self-driving cars will do better than humans. Perhaps they will, but it isn't just about the numbers. As I alluded to in another comment, imagine if they had 50% of the human death rate, but the victims were all small children and the deaths occurred seemingly at random. Would that be acceptable and a better outcome? These things are less black and white than they might seem.

Another factor is that deep learning is effectively a black box. It is very difficult to understand WHY a particular course of action was taken, whereas with human drivers you can usually determine this. Even when a 'why' is available it may be limited, e.g. 'human child was mislabelled as a stop sign' or similar, which just raises further questions.

The fact that there are scenarios the car will definitely not know how to handle, combined with a deep level of inscrutability, makes it difficult to have confidence that issues will be resolved or that the technology will improve on safety.


If self-driving cars are marginally better at driving than humans, but people take more rides in them out of convenience, it's easy for the technology to result in a net increase in human deaths.
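
To see how, here's a minimal sketch; the per-mile rates and the 30% increase in miles driven are assumptions for illustration, not data:

    # Deaths = (fatality rate per mile) x (total miles driven).
    # All numbers are hypothetical, chosen to show how a technology
    # that is safer per mile can still raise total deaths if it
    # induces enough extra travel.
    human_rate = 1.1e-8           # assumed deaths per vehicle-mile
    sdc_rate = human_rate * 0.9   # assume SDCs are 10% safer per mile

    miles_now = 3.2e12            # assumed annual US vehicle-miles
    miles_sdc = miles_now * 1.3   # assume 30% more riding, for convenience

    print(f"Human deaths/year: {human_rate * miles_now:,.0f}")  # ~35,200
    print(f"SDC deaths/year:   {sdc_rate * miles_sdc:,.0f}")    # ~41,200
    # 10% safer per mile but 30% more miles -> ~17% more deaths.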


I don't think fatalities would go down if all cars were replaced by self-driving cars; in any case, there is little evidence to support that. So I am skeptical of the claim that people are being too hard on self-driving cars.


Is the manufacturer gonna take the liability for those deaths?


It seems obvious they'll have to, provided the vehicles are being used and maintained in accordance with the manufacturer's instructions.

Would you get in a vehicle if you could be charged with manslaughter for killing someone through absolutely no fault of your own?

Conversely, I can imagine society collectively throwing up its hands in the event of a death caused by an autonomous vehicle and saying, "Them's the breaks. It's all for the good of society."



