Tesla driver killed in crash while using car's 'Autopilot'

Discussion in 'Wall St. News' started by dealmaker, Jun 30, 2016.

  1. Pekelo

    Pekelo

    I agree, but that is why these questions should be debated and a consensus reached on the solution/rule/guidance, instead of just letting each company decide how to program its cars.
     
    Last edited: Jul 8, 2016
    #41     Jul 8, 2016
    Simples likes this.
  2. sprstpd

    sprstpd

    It would be nice to nail down these sorts of rules, but then you are holding AI cars to a higher standard than humans (i.e., in quick crash-type situations, humans cannot be expected to be rational and make "good" choices). I am not sure that is the most productive path to eliminating human drivers from the equation ASAP.

    Also, when AI cars make up a large percentage of traffic, you can start to make things a lot safer by having all cars have emitters so that the cars can communicate with each other in real time. You might even have small emitters that pedestrians can wear. Or maybe new shoes have emitters embedded in them. Then the NSA can track everyone!
     
    #42     Jul 8, 2016
  3. Pekelo

    Pekelo

    Of course, and why shouldn't we? A robot is better at manufacturing than the average factory worker. Also, they are talking about robot judges to eliminate subjectivity in court cases...
     
    #43     Jul 11, 2016
  4. sprstpd

    sprstpd

    Because it delays the wide-scale adoption of AI cars, which means the loss of many lives that could have been avoided.
     
    #44     Jul 11, 2016