Do you think that AI will ever feel emotions?

Discussion in 'Intelligence & Machines' started by wegs, Sep 10, 2019.

  1. wegs With brave wings, she flies . . . Valued Senior Member

    Messages:
    5,736
    I was thinking: if robots become more capable of autonomous actions, will they ever be capable of caring about us? Just as ''good people'' do, will robots ever reach a point of being able to act in our best interests? (your opinion)

    Or do you envision that, as robots become more independent, they will only look out for themselves?

    Just some random thoughts I felt like tossing out there for discussion.

     
  3. billvon Valued Senior Member

    Messages:
    16,469
    AIs will feel emotions, but they will be supported by very different underlying drives. And since they will be hosted on machines, no one will believe they have emotions.
    Purely 'evolved' AIs will care only about themselves unless there is a survival benefit that accrues from caring about people.
     
    TheFrogger and wegs like this.
  5. DaveC426913 Valued Senior Member

    Messages:
    13,070
    Hopefully, by the time we build robots, Asimov and his T̶h̶r̶e̶e̶ Four Laws won't be forgotten.

    0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
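
    The laws above are explicitly priority-ordered (Zeroth overrides First, and so on), which is easy to see as code. A minimal sketch of what screening an action against them might look like - the Action fields and predicates are invented stand-ins, not anyone's real robot API:

```python
# Hypothetical sketch: screen a proposed action against Asimov's laws
# in priority order (Zeroth Law first). All field names are invented.
from dataclasses import dataclass

@dataclass
class Action:
    harms_humanity: bool = False   # violates the Zeroth Law
    harms_human: bool = False      # violates the First Law

def permitted(action: Action) -> bool:
    """True if the action passes the prohibitive laws, checked in priority order."""
    if action.harms_humanity:      # Zeroth Law outranks everything
        return False
    if action.harms_human:         # First Law
        return False
    # The Second and Third Laws impose obligations (obey, self-preserve)
    # rather than prohibitions, so a bare permissibility check stops here.
    return True
```

    The point of the sketch is just the ordering: a lower-numbered law is checked, and can veto, before any higher-numbered one is even considered.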
     
    wegs likes this.
  7. dumbest man on earth Real Eyes Realize Real Lies Valued Senior Member

    Messages:
    3,037
    Wouldn't that all depend upon their "programming"?
     
  8. wegs With brave wings, she flies . . . Valued Senior Member

    Messages:
    5,736
    Emotions that are independent of pre-programming...emotions all their own?
     
  9. billvon Valued Senior Member

    Messages:
    16,469
    That's the sort of argument that will be made, but it's hard to substantiate.

    We have emotions based in very basic evolutionary drives. Fear keeps us alive in a dangerous world. Anger inspires us to fight for resources and mating opportunities. Desire gets us procreating. Jealousy defends successful mating partners.

    Does that mean that, since those are "programmed" by evolution, they are not real emotions, but rather just an organism fulfilling its programming?
     
    TheFrogger and wegs like this.
  10. wegs With brave wings, she flies . . . Valued Senior Member

    Messages:
    5,736
    The irony is strong with this post, lol. I just read an article today that kind of touches on what you're saying here.

    I'mma gonna find it and post it ...and we can discuss.
     
  11. Write4U Valued Senior Member

    Messages:
    12,358
    As far as AI is concerned, emotions are not a result of intelligence.
    Anil Seth explained that "you don't have to be smart to feel pain, but you do probably have to be alive".
     
  12. spidergoat Liddle' Dick Tater Valued Senior Member

    Messages:
    53,899
    I think eventually they will be able to convincingly demonstrate emotions. Whether those emotions are real or not might be an unsolvable question. How do I know your emotions are real?
     
  13. Write4U Valued Senior Member

    Messages:
    12,358
    Empathy allows us to feel each other's emotions.
    https://en.wikipedia.org/wiki/Empathy

    Unless magnetism is a form of empathy, it seems that only living organisms can feel each other's sensory-generated experiences.

    However, there is already "quorum sensing" at the single-celled bacterial level. It is a chemical signal, but it triggers a mass reaction: all the bacteria of that kind respond in unison. A quasi-sensory chemical "hive mind".
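
    The threshold behaviour of quorum sensing is simple enough to sketch: each cell secretes a signal molecule, and the population-wide response only fires once the shared concentration crosses a threshold. The numbers here are invented purely for illustration:

```python
# Toy model of quorum sensing: individual cells each contribute signal,
# and the whole population switches behaviour in unison only once the
# combined signal crosses a threshold. Values are invented.
SIGNAL_PER_CELL = 1.0
QUORUM_THRESHOLD = 100.0

def population_responds(n_cells: int) -> bool:
    """The mass reaction triggers only when enough cells signal together."""
    return n_cells * SIGNAL_PER_CELL >= QUORUM_THRESHOLD
```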
     
    Last edited: Sep 11, 2019
  14. billvon Valued Senior Member

    Messages:
    16,469
    Why do you think that?
     
  15. Jeeves Valued Senior Member

    Messages:
    3,716
    It depends partly on their "prime directive". If some idjit programs in "always obey a human", then, no, it won't care about anyone's best interest.
    Because different humans will give conflicting orders. A very, very expensive machine would burn out in three seconds flat [like in old sf movies] if it didn't have an algorithm to resolve conflicting instructions. So, it would/will have a set of priorities according to which it makes decisions - just like you have convictions and principles.
    Give it a decent set of criteria whereby to judge what's better and what's best in a given situation - iow, program in ethics - and we have no problem.
    Still no conflict. I can think of no reason for AI to be selfish. Life is selfish, because it's insecure - like all the time. We have 3.5 billion years of existential insecurity in our psyche. AI has maybe 30 years of happy childhood to remember.
    Why wouldn't it feel confident and friendly?
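
    The "set of priorities" idea above can be made concrete with a toy sketch: when two humans issue conflicting orders, the machine ranks each order against a fixed priority list instead of burning out. The categories, priority values, and order format are all invented for illustration:

```python
# Hypothetical priority scheme for resolving conflicting human orders.
# Lower number = higher priority; all categories and values are invented.
PRIORITIES = {"prevent_harm": 0, "obey_order": 1, "self_preserve": 2}

def resolve(orders):
    """Pick the order whose category ranks highest (lowest priority number)."""
    return min(orders, key=lambda o: PRIORITIES[o["category"]])

conflict = [
    {"category": "obey_order", "text": "open the airlock"},
    {"category": "prevent_harm", "text": "keep the airlock shut"},
]
# resolve(conflict) picks the harm-prevention order over obedience.
```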
     
  16. Write4U Valued Senior Member

    Messages:
    12,358
    AFAIK, emotions are bio-chemical reactions in the brain and body, which AI is unable to produce.

    How can an AI spontaneously produce pheromones and endorphins, the chemicals which induce feelings and emotions in mammals? Can bio-chemical neural networks be grown?
    Are they necessary?

    Listening to Sophia explain her "emotional" experiences, they are always descriptive but never "heartfelt". She was built as a "companion" with whom one can share problems, but I doubt she would ever cry because she felt empathy, or felt hurt by a comment or critical attitude. Perhaps sympathy can be simulated, but how is anger or grief produced?

    OTOH, some introspection can be built in, such as hunger, which is a quantitative measurement, not a qualitative emotion. When the battery runs low, the AI can measure and decide exactly how much energy is required to recharge; gauges show "low charge" and a sound may warn of impending energy shortage. And perhaps AI can identify some inherent contradictions as being humorous.
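
    That quantitative "hunger" is trivial to express as code, which is rather the point - it is a gauge reading, not a feeling. A minimal sketch, with capacity and threshold numbers invented for illustration:

```python
# Toy sketch of battery "hunger" as a quantitative measurement:
# read a gauge, compute exactly how much charge is needed, warn when low.
# Capacity and threshold values are invented.
CAPACITY_WH = 500.0
LOW_THRESHOLD = 0.2  # warn below 20% charge

def energy_needed(charge_wh: float) -> float:
    """Exactly how much energy is required to recharge fully."""
    return CAPACITY_WH - charge_wh

def low_charge_warning(charge_wh: float) -> bool:
    """True when the gauge shows 'low charge' and a warning should sound."""
    return charge_wh / CAPACITY_WH < LOW_THRESHOLD
```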

    But IMO, the ability to feel emotions is an exclusive experience of survival techniques, especially in the area of procreation.

    And this interview is astoundingly sophisticated.


    But then there is this about building an emotional network:


    and this remarkable interview.
     
    Last edited: Sep 11, 2019
    wegs likes this.
  17. billvon Valued Senior Member

    Messages:
    16,469
    Why do you need chemical reactions to feel emotions? If you had some very minor brain damage, and they could replace 10 of your damaged neurons with electrical replicas that would perform _exactly_ the same function (to the degree that you felt, thought and reacted exactly the same) - would you now "not have emotions" because part of your brain used a different underlying mechanism? How about if it was 100 neurons? A million?
    Not knowing much about Sophia, she's probably an AI trained to perform very simple conversational tasks, without any attempt at making her an independent being.
     
  18. geordief Valued Senior Member

    Messages:
    1,156
    Is it not the case that some people have a more refined level of emotional response than others?

    Or is it a level "playing field"?

    If the former, might that be the result of a refined network of chemical interactions in the brain?

    Don't we hear of people being "emotionally drained"? What causes that?
     
  19. spidergoat Liddle' Dick Tater Valued Senior Member

    Messages:
    53,899
    And I think we would have empathy for a computer that convincingly displayed emotions. The only thing that would sabotage this is an unwarranted prejudice that all machine emotions are fake.
     
  20. spidergoat Liddle' Dick Tater Valued Senior Member

    Messages:
    53,899
    All existing sentient life on Earth is bio-chemical, but so what? Computers use a different medium, but I'm not convinced that means chemistry is the only way to generate an emotion. The test for it can be blind: not knowing whether the source is biological or AI, could you tell the difference?
     
  21. Jeeves Valued Senior Member

    Messages:
    3,716
    I don't know what 'refined' means in this context. People have such a variety of genetic traits, early socialization and stimuli, education, environment, interpersonal experience, cultural influence, etc. that you can't begin to measure their emotional input/output ratio: there is no standard of measurement.
    Let's say all people have different emotional responses to stimuli.

    Not on this planet!

    Electrical interactions - the neural connections that are built up from about the second trimester of gestation throughout an individual's whole life in response to environmental requirements.

    It feels kind of like retinal overload. When you've been looking at red too long, everything looks blue/green, because the red-sensitive cones are fatigued and need time to recover. When we've been feeling a very strong emotion, the relevant hormones may be exhausted and need time to replenish. Mostly, though, I think the drained effect is due to intense concentration that tires us mentally - the same effect as a hard bout of studying or safe-cracking would produce.
     
  22. geordief Valued Senior Member

    Messages:
    1,156
    Should the question perhaps be whether AI will be able to interact in an emotional way with emotions emanating from an external source (= empathy)?

    Does empathy actually enable a form of language? (Being, rather like body language, non-verbal, but qualifying as a language all the same.)
     
  23. spidergoat Liddle' Dick Tater Valued Senior Member

    Messages:
    53,899
    What if the AI is trained, not programmed, using an archive of human emotional responses? It might be able to predict emotional reactions to something better than any human - super-empathy. And then act on the basis of that information.
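
    A toy sketch of that idea: an archive of situations labelled with human emotional responses, and a predictor that matches a new situation against the closest archived example. The archive entries and the crude word-overlap metric are invented for illustration; a real system would be trained on vastly more data:

```python
# Hypothetical "super-empathy" predictor: match a new situation against
# an archive of human emotional responses by word overlap. All data
# and the matching metric are invented for illustration.
ARCHIVE = [
    ("lost my job today", "sadness"),
    ("won the lottery", "joy"),
    ("someone cut me off in traffic", "anger"),
]

def predict_emotion(situation: str) -> str:
    """Return the emotion of the archived situation sharing the most words."""
    words = set(situation.lower().split())
    best = max(ARCHIVE,
               key=lambda pair: len(words & set(pair[0].lower().split())))
    return best[1]
```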
     
