I think the same accident situation gets more complex with the use of AI. If I kill 2 bikers by suddenly reacting and saving myself, that is involuntary manslaughter and an understandable human reaction. It probably happens so quickly that there is no conscious decision, just muscle reflex. But with autopilot and programming, someone programmed the software to decide who lives and who dies. That is not involuntary but very voluntary manslaughter. Here is a very similar situation where 7 people died altogether: (video is slowed down) Normal speed:
You are worrying about minutiae. An algorithm should protect its passengers at all costs, just like a driver would. Problem solved.
The AI software might be immature, but eventually it will get better. The human, however, will always be the weakest link. The guys in the above video didn't understand the technology they were using: they thought they had pedestrian detection, when in fact they only had vehicle detection in that car.
Shoulda, woulda, coulda. You are clueless. In your perfect little world everything works as intended and there are no accidents. Got it...
What the f*ck are you talking about? Do you think there will be more or fewer accidents with AI driving? That is all that matters, not some technicality about decision making at the moment an accident occurs. I would much rather have all cars driven by AI than risk drunk drivers, texters, etc.
There will always be accidents, AI or not, but fewer accidents mean less chance of having to make the decision of whom to kill, so overall it's much safer. Perfect? Of course not. Most of the time accidents happen so quickly that even AI won't have time to do anything; it still takes time to brake, steer, or speed up.
Yeah, there is a brand new technology, so why should we regulate it, right? Let's say it is flying cars: who needs regulation, you moron... Let me put you on Ignore, it makes life easier...