Should your self-driving car kill you to save others?

Discussion in 'Intelligence & Machines' started by Plazma Inferno!, Jun 24, 2016.

  1. Sarkus Hippomonstrosesquippedalophobe Valued Senior Member

    Messages:
    10,353
    I would like to think that cars would drive themselves at speeds commensurate with the level of risk they face. We have speed limits for a reason, and we have lower speed limits in built-up areas for a reason.
    In the example in the OP, if the automated car is in a location where pedestrians are likely, it surely shouldn't be going fast enough to end up killing anyone.

    Maybe the car sellers should devise a VR simulation/test of various moral situations for the purchaser by way of calibration, and then set the car's systems accordingly.
     
  3. Jeeves Valued Senior Member

    Messages:
    5,089
    I'm not sure it's a good idea to have self-driving cars calibrated to the personalities of the drivers who are making the roads so dangerous now.
    The number of traffic fatalities in the US increased steadily as the number of cars increased, until about 1980, and then began to decrease (even though car travel kept increasing). In 2014, it was down to the level of 1918. https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_U.S._by_year That's because of safety measures in traffic control, road construction, law enforcement and, most of all, the design of cars (except maybe Toyota). Humans didn't grow smarter, more careful or more skillful - cars did. And will continue to.
     
  5. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    7,985
    Yes... and with its better reaction time, fewer kids darting into the road would wind up as roadkill!

    I think all cars should be calibrated to behave in the same way, so people walking would better know what to expect when encountering one of these cars!
     
  7. gmilam Valued Senior Member

    Messages:
    3,522
    A computer (or human driver) won't have access to those variables. So they aren't really relevant.
     
  8. Sarkus Hippomonstrosesquippedalophobe Valued Senior Member

    Messages:
    10,353
    I merely mean calibrating them morally: do they kill the pedestrians or the driver, etc. - not calibrating bad habits such as running stop lights, speeding, etc.
    The car should be designed to be as safe as possible for all, and will likely be far more aware of the surroundings, the road conditions, etc. So it is only the "moral" decision-making that would need to be calibrated.
     
  9. Jeeves Valued Senior Member

    Messages:
    5,089
    But the moral - or more accurately, social - attitude is what separates the good and bad drivers, far more effectively than skill. It's the selfish and heedless who speed, run stop signs and cut in. So they would set their car's moral standard to Maximum Me, while the responsible, altruistic drivers, who are already careful and law-abiding, would set theirs to Save the Child at any Cost. The meek would become prey to the aggressive - SOP.
     
  10. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    I guess I'm saying that you can oversimplify what you tell a computer to do. You can tell it to save three and kill only one. But what if you have one group with two big ones and one little one and another group with one big one and two little ones? If you're arbitrarily putting more value on many than on few, what arbitrary criteria do you use to choose between groups of equal numbers?
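    The tie problem above can be sketched in a few lines. This is a purely hypothetical illustration (the groups, the scoring rule, and all values are invented): a naive "spare the bigger group" criterion simply has no answer when the groups are the same size, so the programmer is forced to encode some further, equally arbitrary, tiebreaker.

    ```python
    # Hypothetical sketch: a naive "value a group by headcount" rule,
    # applied to sideshowbob's example of two equal-sized groups.
    # All names and values are invented for illustration.

    def group_score(group):
        """Naive criterion: value a group purely by headcount."""
        return len(group)

    # Two groups of three: two adults + one child vs. one adult + two children.
    group_a = ["adult", "adult", "child"]
    group_b = ["adult", "child", "child"]

    if group_score(group_a) == group_score(group_b):
        # The headcount rule gives no answer; some further arbitrary
        # weighting (age? size?) would have to be programmed in.
        decision = "tie - arbitrary tiebreaker required"
    else:
        decision = "spare the larger group"

    print(decision)  # -> tie - arbitrary tiebreaker required
    ```

    Whatever tiebreaker gets added (weight by age, by size, by anything), it is just another arbitrary value judgment baked in by a programmer, which is exactly the point being made.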
     
  11. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    Well, yes, the driverless car could easily know who the surgeon in the back seat is. And the hospital administrators could easily program the car to save him at (more or less) all costs. At some point, human decisions are made. You can have them made cold-bloodedly in an office or you can have them made on the spur of the moment in the driver's seat.
     
  12. EddyNashton Registered Member

    Messages:
    22
    I think to remedy this we should have an override feature that a person can press in case something like this is about to happen.
     
  13. Jeeves Valued Senior Member

    Messages:
    5,089
    If the human has time to override the computer, make a decision and carry out the maneuver, then the car had time to avoid the accident.
     
    DaveC426913 likes this.
  14. DaveC426913 Valued Senior Member

    Messages:
    18,935
    Now we're cutting a really fine line.
    Not only are there only two options, but both options have the same probability of casualty AND both casualties are perfectly matched. This becomes a reductio ad absurdum.

    Yes. That was my point. It's a ridiculous criterion by which to judge whether someone should live or die.

    I can't speak for you, but my life is not worth less than a surgeon's. Anyone else will say the same about themselves.

    What? No they couldn't.

    Yes. That saving the most lives is best. Not saving this guy because we really, really like him.
     
  15. Jeeves Valued Senior Member

    Messages:
    5,089
    Anyway, the surgeon beats his wife and cheats on his income tax. Your car didn't know that, because it wasn't programmed to probe deep background. Needs work.
     
    DaveC426913 likes this.
  16. DaveC426913 Valued Senior Member

    Messages:
    18,935
    Could not have said it better myself.
     
  17. Jeeves Valued Senior Member

    Messages:
    5,089
    The relative value of suddenly-lurching pedestrians was always a red herring.
    The relevant questions would be:
    What contingencies might arise too fast for a computer to be aware of them in time to avoid calamitous choice? and
    Is there any way to program for such contingencies? and
    If so, on what basis?
     
  18. billvon Valued Senior Member

    Messages:
    21,635
    But his wife is a child molester, and he only hits her to protect the children . . .

    This is all pretty silly. No autopilot today makes such decisions. No autopilot in the foreseeable future will, either.
     
  19. Jeeves Valued Senior Member

    Messages:
    5,089
    That was the point! Nobody makes such distinctions or such decisions. Nobody can.
    If a human driver has neither the information nor the time to choose - and he doesn't! - in the actual situation, then the whole idea of humans programming a machine to do it in all possible, as yet unknown, situations is far beyond silly.
     
  20. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    The point is that if a computer has to decide between the two, it has to have some criteria to work with. No matter how ridiculous you think those criteria are, no matter how arbitrary they are, there have to be criteria. The surgeon's car is more likely to choose him than you or me.

    That's your opinion, which doesn't really matter. The car is programmed to do what it's programmed to do. Unless you dictate the programming, your opinion has no effect on it.
     
  21. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    The car doesn't need to know that. All it needs to know is that the switch is set to "Save occupant at all costs."
     
  22. Jeeves Valued Senior Member

    Messages:
    5,089
    Only if an engineer had built in a switch that can say that. I don't believe a sufficiently convincing case has been made for its installation.
     
  23. billvon Valued Senior Member

    Messages:
    21,635
    Self driving cars do not make decisions in line with "saving" occupants. They make decisions to avoid collisions. Thus the position of the switch would be meaningless to the car's driving algorithm.
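    This distinction can be made concrete with a small sketch. It is hypothetical (the maneuver names and risk numbers are invented, not from any real driving stack): the decision rule scores candidate maneuvers by estimated collision risk alone, so there is simply no occupant-vs-pedestrian term for a "save occupant" switch to plug into.

    ```python
    # Hypothetical sketch: a collision-avoidance planner that ranks
    # candidate maneuvers purely by estimated collision probability.
    # Note there is no "whose life" weighting anywhere - the switch
    # position would have nothing to influence.
    # All values are invented for illustration.

    candidate_maneuvers = {
        "brake_hard":   0.10,  # estimated probability of any collision
        "swerve_left":  0.35,
        "swerve_right": 0.60,
        "maintain":     0.90,
    }

    def choose_maneuver(risks):
        """Pick the action with the lowest estimated collision probability."""
        return min(risks, key=risks.get)

    print(choose_maneuver(candidate_maneuvers))  # -> brake_hard
    ```

    The point of the sketch is structural: a minimizer over collision risk never consults identities, so adding a moral-preference switch would require redesigning the objective itself, not just flipping a setting.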
     
