Maybe each car company will have to pay a fine for each person the autopilot car kills. Don't worry, Trump / Clinton will decide on the priorities.
Why is this even brought up? So if you are a human driver and you encounter this situation, what do you do? Is there even a right answer? I would bet that an AI driving robot would detect a potential collision way before a human could and would avoid the situation entirely.
This unfortunate incident is just another sign that we do not have legal regulation for self-driving cars. It will take years for the government to generate regulations (unless they suddenly become extremely productive), and it will take years for the industry to polish the technology. There will be more accidents, and both the policies and the technology will be "polished" on these accidents.
Because I brought it up. Because when you program, you plan for ALL kinds of situations, and you need answers to those moral questions. And no, an AI cannot avoid every bad situation as long as humans are also driving. It could even be a pure accident: a car coming toward you blows a tire; no one can predict that.
Why are autopilot cars even allowed on the roads? Are we there yet in technology? Can those cars see in the middle of the night in a blizzard? If I need a driving licence, shouldn't a company pass some kind of test before its software is allowed to drive cars or even trucks? And how about the labour issues? (This is more politics than technology.) If suddenly trucks can drive themselves, that is a lot of jobs lost...
Ok, so if you are driving the car, what is your decision? Your question is irrelevant. I would take my chances with AI driving vs. human drivers any day of my life.
Yes, I'd rather have AI than humans, especially some of the ones I know. These errors will get worked out. I'm sure humans were just as concerned when their horses and buggies were supplanted 100+ years ago. Also, AI might be something like vaccinations: once everyone has one, you can stop worrying. In other words, maybe there won't be any more dangerous situations created, so you won't have to avoid an accident, because there will be no idiots driving to precipitate one. Maybe the insurance companies will more or less force us to use AI by charging different rates and making it prohibitively expensive to drive ourselves.

In my 45 minutes of errands yesterday, I saw:
- a woman in a hurry driving partially on the sidewalk to illegally pass another car on the right
- a man driving 45 mph on a freeway with a speed limit of 60 mph
- a truck edging over the lane markers on the freeway
- a woman yakking on her phone even though it's illegal in this state
- a man driving with a golden retriever on his lap
- an oncoming driver (gender not clear) running a stop sign
- a man in a sports car driving 50 mph in a 25 mph zone with construction workers around and cones up
- a man in a tiny car tailgating me

Shit, makes me want to get in bed and put the covers over my head.
Of course if I am driving the car, I'm going to do whatever saves MY life. The hell with my car and with everyone else and their car. But I might not be adept enough, in a split second, to figure out what that course of action is.
Self-preservation. Except the software doesn't need to save itself, so it has to make a choice: does it save the one passenger by killing the two bikers in the next lane? After all, the passenger bought the software/car. But that is not fair to the bikers, not to mention trading two lives for one... So the two bikers' families will sue the software company into oblivion/bankruptcy... So yes, the question is VERY relevant. And if they program the software NOT to save the passenger, who is going to buy the car/software???
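To make it concrete, here's a toy sketch in Python of the choice somebody has to hard-code. All the names are hypothetical and no real autopilot works like this; the point is only that the priority has to be programmed one way or the other:

    # Purely hypothetical sketch -- not real autopilot code.
    # It just shows that someone must hard-code the moral priority.
    from dataclasses import dataclass
    from enum import Enum

    class Priority(Enum):
        PROTECT_PASSENGER = "protect_passenger"  # swerve into the bikers
        MINIMIZE_DEATHS = "minimize_deaths"      # sacrifice the passenger

    @dataclass
    class Option:
        description: str
        passenger_deaths: int
        bystander_deaths: int

    def choose(options: list[Option], policy: Priority) -> Option:
        """Pick a maneuver; the 'right' answer depends entirely on the policy."""
        if policy is Priority.PROTECT_PASSENGER:
            # Passenger deaths dominate; bystanders only break ties.
            return min(options, key=lambda o: (o.passenger_deaths, o.bystander_deaths))
        # Pure body count: whoever dies, fewer is better.
        return min(options, key=lambda o: o.passenger_deaths + o.bystander_deaths)

    options = [
        Option("stay in lane, hit the obstacle", passenger_deaths=1, bystander_deaths=0),
        Option("swerve into the bikers", passenger_deaths=0, bystander_deaths=2),
    ]

    print(choose(options, Priority.PROTECT_PASSENGER).description)  # swerve into the bikers
    print(choose(options, Priority.MINIMIZE_DEATHS).description)    # stay in lane, hit the obstacle

Whichever branch ships in the car is the manufacturer answering the moral question for you.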
Maybe they'll let me change the settings! But I'll have to check every day to make sure a software update does not put me back to trading my life for someone else's.