Should your self-driving car kill you to save others?

Discussion in 'Intelligence & Machines' started by Plazma Inferno!, Jun 24, 2016.

  1. DaveC426913 Valued Senior Member

    Messages:
    18,935
    No, there really doesn't.

    1] The situation where two options are identical in every way that a car can distinguish is too extreme to worry about. And if you insist on going to the extreme, then it wouldn't matter which option it chose.

    2] We do NOT open the door to placing a value on a life based on knowing a person's livelihood. That is a slippery slope I am certain no sane citizen will go down.
     
  2. DaveC426913 Valued Senior Member

    Messages:
    18,935
    Yes, but we can extrapolate to what a car might have to decide if it could not avoid an accident. It can detect pedestrians; it will try to avoid them, but what if it can't? This is the edge case we're exploring.
     
  3. billvon Valued Senior Member

    Messages:
    21,634
    Then it will still try.
     
  4. DaveC426913 Valued Senior Member

    Messages:
    18,935
    Obviously, but try how?

    If a pedestrian jumps out into its path, and its only escape route is blocked by other pedestrians, can it make a decision that causes the least damage? What criteria might it use to do so?
     
  5. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    But there's no reason why they couldn't.
     
  6. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    The day after self-driving cars appear on the market, you'll be able to download hacks for the software.
     
  7. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    That's what's called a bug. "We never thought it would happen, so we didn't bother to program for the possibility." Think Y2K.
    Sanity has never held humanity back before.
     
  8. billvon Valued Senior Member

    Messages:
    21,634
    True. There's no reason they couldn't target red cars for ramming, either - but also no good reason they _would_ do that.
     
  9. DaveC426913 Valued Senior Member

    Messages:
    18,935
    No, it isn't a bug. A bug is software behaving in a way other than intended.
    There is no downside to it, so there's nothing to program against.

    If the AI were faced with two choices, both equally bad, and it chose one over the other, that would not be a fault. By definition, since both are equally bad, it doesn't matter.
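
    To put that in concrete terms - a toy sketch, with every name and number invented for illustration rather than taken from any real car's code:

        # Score each feasible maneuver by expected harm and pick the minimum.
        # When two options score identically, min() returns the first one seen:
        # an arbitrary but deterministic tie-break, not a fault, because by
        # assumption neither option is worse than the other.

        def choose_maneuver(options):
            """options: list of (name, expected_harm) pairs."""
            return min(options, key=lambda opt: opt[1])

        # Both remaining maneuvers are equally bad; either answer is "correct".
        print(choose_maneuver([("swerve_left", 1.0), ("brake_straight", 1.0)]))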
     
  10. Jeeves Valued Senior Member

    Messages:
    5,089
    Exactly. That's why it should not be installed, not be programmed, not be designed and not be conceived. If the software architects and hardware engineers all practice reasonable prevention, they won't get anyone into these silly dilemmas.
     
  11. Sarkus Hippomonstrosesquippedalophobe Valued Senior Member

    Messages:
    10,353
    But you miss the point that these are autonomous vehicles we're talking about, right? There is no breaking of speed limits, running of stop signs, cutting in, etc. The car is already driving "safely". The question is who the car would prioritise, passenger or pedestrian, in the event that it has to "choose" one over the other.

    Any issue of "poor driving" is thus a red herring in this matter, as far as I can see.

    That altruistic drivers would set their car's moral compass differently to others is not to say that they would be driving any more safely... as the car itself is already driving, and presumably driving safely. To me that latter aspect is a given, and this question is purely and simply about a choice that the car would effectively have to make.
     
  12. Jeeves Valued Senior Member

    Messages:
    5,089
    No, I didn't miss the point. Aggressive driving habits do not result from lack of skill (could have sworn I mentioned this, but maybe the herring ate it); they result from a lack of consideration for others.
    It's not a question of "poor" drivers, but of selfish drivers.
    You suggested selfish drivers be given the option of installing their moral standard into an autonomous vehicle, and I didn't like the idea.

    And I keep saying the probability of an autonomous car ever getting itself into such a quandary is so low as to be unworthy of consideration. And even if the situation could somehow be contrived, the computer could not, would not and should not be expected to make a moral choice. Its job is to make the smartest decisions available, at all times, without unnecessary emotional constraints placed upon it.

    Do you guys want gridlock while all the poor robots burn out their microcircuits, trying to figure out what Papa would approve of?
     
  13. Sarkus Hippomonstrosesquippedalophobe Valued Senior Member

    Messages:
    10,353
    No, I didn't suggest that. I suggested that they set up testing to be able to calibrate - i.e. from within a defined set of parameters, none of which will be "selfish" - and more specifically to set up the car in line with the driver's own morality with regard to this single issue. This is, after all, an autonomous car, which will drive as safely as possible and abide by the whole Highway Code. Being selfish or not won't be an issue.
    But at no point did I suggest, or in any way imply, that the driver's entire driving habits or the reasons for them (i.e. their entire moral standard) be uploaded into the car. Consideration for others will be a given in any autonomous car. At least that is how I see it.
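
    Something like this toy sketch is all I mean by calibrating from within a defined set of parameters (the parameter name and the bounds are invented for illustration):

        # The owner may bias the car's occupant-vs-pedestrian weighting, but
        # only within a manufacturer-defined band, so a "selfish" extreme is
        # simply not expressible.

        ALLOWED_BAND = (0.4, 0.6)  # assumed bounds on the occupant-priority weight

        def calibrate(owner_setting):
            """Clamp the owner's chosen weight into the allowed band."""
            low, high = ALLOWED_BAND
            return max(low, min(high, owner_setting))

        print(calibrate(0.99))  # a maximally self-preserving owner still gets 0.6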

    In my view, the question of whether you would sacrifice your own or the pedestrian's life has nowt to do with how selfish a driver you are. It is simply a matter of how much you value your life compared to the lives of others. You could be the most considerate and unselfish driver around but still value your own life above everyone else's in such a situation: better to live with the consequences than not live at all, etc.
    This simply begs the question: what does it mean to make "the smartest decision" in the case where it must put either the occupant or the pedestrian at equal risk of fatality? Answering that question with just "it will make the smartest decision" is no answer at all.
    No, which is why giving it an ultimate default for when all other decision-making produces no clear result sounds like quite a good thing to do (see the sketch at the end of this post).


    And yes, it's a hypothetical question, sure, but still worthy of consideration, not least because someone somewhere will have to program relative values for such things into the AI.

    Call it a matter of morals or simple programming or anything else. It wouldn't be emotions, that's for sure (unless we're talking seriously advanced AI), and it is a question that will be asked again and again. Whenever an autonomous car crashes there will/should be a post-mortem of the decision-making (as far as possible), and at some point there will be a scenario where the car has prioritised one life over another, not necessarily through a direct decision of which to sacrifice but through whatever priority it does have. Who gets to determine what that priority is? That's the question here.
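
    To make that concrete - again purely illustrative, with hypothetical names and numbers - the calibrated weight never drives normal planning; it is consulted only as the ultimate default, when the rest of the decision-making has produced no clear result:

        # Normal planning minimises total expected harm. The owner-calibrated
        # occupant-priority weight is consulted only when the harm scores tie.

        def resolve(options, occupant_weight):
            """options: list of (maneuver, harm_to_occupant, harm_to_pedestrian)."""
            def total(opt):
                _, occ, ped = opt
                return occ + ped
            best = min(total(o) for o in options)
            tied = [o for o in options if total(o) == best]
            if len(tied) == 1:
                return tied[0]
            # No clear result: fall back to the owner-calibrated default.
            return min(tied, key=lambda o: occupant_weight * o[1]
                                           + (1 - occupant_weight) * o[2])

        # Two equally bad outcomes; a weight of 0.6 protects the occupant.
        print(resolve([("swerve", 1.0, 0.0), ("brake", 0.0, 1.0)], 0.6))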
     
  14. Jeeves Valued Senior Member

    Messages:
    5,089
    Oh. Sorry for failing to see that distinction.
     
  15. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    But it's a human being making the ultimate choice about what is "equal" in terms of badness. Should I program it to save the surgeon, who has a known value, or the unknown pedestrians? Don't kid yourself that no programmer will make the obvious choice.
     
  16. billvon Valued Senior Member

    Messages:
    21,634
    And don't kid yourself that programmers have that choice to make. The system isn't good enough to make that decision.
     
  17. DaveC426913 Valued Senior Member

    Messages:
    18,935
    (We were supposing, for the sake of argument, that it was technically possible. The issue was: would we make use of such knowledge? Well, that wasn't really an issue, because no sane adult thinks that a machine should be programmed to choose one life over another based on some valuation of those lives.)

    The surgeon has as known a value as any other unknown pedestrian.

    But what does the programmer have to do with it? Since we don't write the software to check the pedestrian's Facebook profile in the first place, there's no choice for the programmer to make.
     
  18. Sarkus Hippomonstrosesquippedalophobe Valued Senior Member

    Messages:
    10,353
    One is with regard to how one uses the road in general; the other is about a willingness to die instead of killing a pedestrian.
    It is indeed a distinction that is rather important, so there is no need to be sarcastic about it.
     
  19. Jeeves Valued Senior Member

    Messages:
    5,089
    If you are asserting that selfish people who disregard the safety of others on the road "in general" will set their permanent preference to sacrifice themselves for a stranger, then the sarcasm was quite necessary.
     
  20. Sarkus Hippomonstrosesquippedalophobe Valued Senior Member

    Messages:
    10,353
    That is not what I am asserting, but thanks for the strawman.
    But even if you think, as you seem to do, that all people who drive selfishly would choose themselves over the pedestrian, you have missed the occupants who are not themselves selfish drivers. Do you similarly think that they would all choose the pedestrian over themselves given the dilemma in question?
    Are you a selfish driver? Would you choose yourself or the pedestrian?
     
