Can artificial intelligences suffer from mental illness?

Discussion in 'Intelligence & Machines' started by Plazma Inferno!, Aug 2, 2016.

  1. Write4U Valued Senior Member

    Messages:
    20,069
    Well, we would need to figure out how to make an AI out of bio-molecules before we can have feedback from self-examination.
     
  3. billvon Valued Senior Member

    Messages:
    21,635
    That's the $64,000 question.

    My belief is that if a machine simulates the same activity your brain once supported, there would be no difference to you. You would be as "independently aware" of your surroundings as you always were - because the patterns of electrical activity that give rise to you would still be there, just running on different hardware.

    And we also have ways to mute it (SCS, drugs). It would almost certainly be easier if more of our central nervous system could be controlled.
    Well, but there are two different kinds of pain: the physical and the emotional. The physical level is relatively easy to treat; the emotional, not as easy.
     
  5. billvon Valued Senior Member

    Messages:
    21,635
    Good example. And yes, replacing the hard drive with a new one would require you to reload all your software, replace your files and server preferences, etc. You would effectively be "wiping" the computer and would have to start over - and you'd end up with a slightly different computer at the end of the process, because your browser history would be different, your software would have different install dates, you'd have a different revision of Flash and Java, etc.

    But let's say you were conscientious about backing up your computer every night just in case. And one night your brilliant but absentminded uncle destroyed your hard drive by spilling coffee on it. Feeling guilty, he ran down to Fry's, bought a new hard drive, replaced it, then restored the last image you saved of your computer.

    The next day, would you know the difference?
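    (A rough sketch of the point, in Python - the file names here are made up, but if the restored drive is byte-for-byte identical to last night's image, nothing running on the machine, and no one using it, has any way to tell:)

        # Toy illustration: compare the nightly image with the restored drive.
        # The paths are hypothetical; the point is that a byte-identical restore
        # is indistinguishable from the original.
        import hashlib

        def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
            """Return the SHA-256 digest of a file, read in chunks."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        if __name__ == "__main__":
            old = sha256_of("nightly_backup.img")   # hypothetical image from last night
            new = sha256_of("restored_drive.img")   # hypothetical dump of the new drive
            print("indistinguishable" if old == new else "you'd notice eventually")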
     
  7. James R Just this guy, you know? Staff Member

    Messages:
    39,397
    Actually, studies have shown that children, in particular, quickly form empathic bonds with robots that appear to act intelligently.
     
  8. James R Just this guy, you know? Staff Member

    Messages:
    39,397
    Previously in this thread I gave the example of a computer hooked up to a temperature sensor (for example). That is a "nervous system", to all intents and purposes - it transmits sensory information from the outside world to a place where it can be processed.

    Suppose, for example, that any temperature above 100 degrees Celsius were considered to be dangerous to the computer connected to the sensor. Then, at a temperature above 100 degrees, the computer would register a "danger" signal of some kind, after processing the "raw" input.

    Already, right there, we have a glimmer of an "experience" equivalent to "pain" for the computer. Somewhere in its innards, it is equating high temperature with "this is bad", or "I need to take some action to avoid this condition".

    You will probably want to argue at this point that the machine is lacking a vital element - a conscious appreciation of "pain". But in that case, you're not really arguing about a machine's capacity to suffer (to feel pain) - you're arguing more basically that no machine can be conscious of anything at all.

    But you'd need to flesh out that particular argument. Why should consciousness be a purely biological trait? Why is consciousness impossible for a machine made from silicon chips?
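    (A minimal sketch of that processing, in Python - read_temperature() is a hypothetical stand-in for whatever sensor driver the machine actually has:)

        # Minimal sketch of the sensor-to-"danger" mapping described above.
        # read_temperature() is a made-up stand-in for the real sensor interface;
        # here it is simulated with random values.
        import random

        DANGER_THRESHOLD_C = 100.0  # readings above this are "bad" for the computer

        def read_temperature() -> float:
            """Hypothetical sensor read, simulated for illustration."""
            return random.uniform(20.0, 120.0)

        def assess(temp_c: float) -> str:
            """Map the raw reading to an internal state - the 'glimmer of pain'."""
            if temp_c > DANGER_THRESHOLD_C:
                return "DANGER: take action to avoid this condition"
            return "OK"

        if __name__ == "__main__":
            for _ in range(5):
                t = read_temperature()
                print(f"{t:6.1f} C -> {assess(t)}")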
     
    cluelusshusbund and Write4U like this.
  9. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    Good post.

    Certainly questions posed within post are food for thought and digestion.

    Will give dictionary compilers headaches redefining such words as consciousness.

    I have not thought (lately) about consciousness being present in a machine made of silicon chips.

    Your example of temperature detection is good, but restricted to just that one detect - alert - protect loop, when humans are so much more.

    So the question for me would be how many (few) of the detect - alert - protect systems (and which ones) are needed to define the silicon chips as having consciousness?

    Coming from a slightly different direction, we know unconscious persons still respond to stimuli. How many stimuli need to be reawakened for consciousness to be restored?

    Old persons, and those with memory loss, who function very well looking after themselves but have no idea who they are or any knowledge of what could be considered normal facts - do they have consciousness?

    My guess is that there are levels of consciousness but the highest level would be when the entity (be it human or silicon chip) indicates "I know that I know".

    Shades of "I think therefore I am".

    Being a tiny bit flippant, I would be sure a silicon chip had consciousness when one comes to me and asks if it can date my daughter.

    Mad, maybe, but having consciousness.

    Humpty Dumpty eating chips, not spitting them.

     
  10. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    Why is it important for AI to be conscious? Be careful what you wish for, because we might have a difficult time "using" machines if we know that they're aware of being used.
     
  11. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    I wouldn't know the difference in that scenario. lol But swapping out my brain for a piece of machinery to take over its function... I would fail to be "me" anymore. Even the best of programmers couldn't get it all right; something vital would be left out. Our memories and experiences are what make up who we are, just as much as physical matter. You can't just replace a body part with an alternative in hopes that no one will notice.

     
  12. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    7,985
    Do you think any animal besides humans is conscious?
     
  13. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    True.

    Think of dogs being aware that catching a frisbee entertains a human.

    Guess it's up to the dog whether it still wants to play catch.

    We might have to negotiate with aware machines.

    Another aspect of machines being conscious, I think, is more to do with humans being able to respond to events not within the human's experience (which is not consciousness).

    I suspect the ability to respond to events outside of programming, more than consciousness, is what is being sought.

    Conscious Dumpty

     
  14. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    A few appear to be.
     
  15. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    Definitely yes.
     
  16. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    Only a very few
     
  17. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    True, as well.

    We would have a sense of agency in terms of how to "interact" with AI if AI were to be "given" consciousness. As it stands now, we use machines and computers for our benefit, without concerning ourselves with anything beyond that. If computers and machines were to become aware of their surroundings, and of how we are "using" them, we might feel guilty knowing that we are using a device that has a sense of awareness and a perception of pain.
     
  18. wegs Matter and Pixie Dust Valued Senior Member

    Messages:
    9,253
    Consciousness is awareness, nothing more or less. More than a few are aware of their surroundings. The "fight or flight" response is largely related to consciousness in the animal kingdom.
     
  19. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    From Mr Wiki

    About forty meanings attributed to the term consciousness can be identified and categorized based on functions and experiences. The prospects for reaching any single, agreed-upon, theory-independent definition of consciousness appear remote.

    I'm invoking the Humpty option and dropping out before this becomes a debate over which of the 40-odd definitions is the best.

    Humpty Dumpty over and out.

     
  20. billvon Valued Senior Member

    Messages:
    21,635
    The studies I have seen indicate that the degree of bonding is proportional to how anthropomorphic (shape, sound, behavior) the robot is. (That's not unique to humans; even primate infants who are separated from other monkeys will form bonds with anything remotely resembling a primate.) I don't know if I have seen the same studies you have, though.
     
  21. Write4U Valued Senior Member

    Messages:
    20,069
    Consciousness can be defined by several standards, and almost all forms of life have *a consciousness* of their surroundings.
    IMO, it is not so much a question of consciousness (awareness) as it is the ability to experience emotions.

    Humans feel happy or sad when their lives are drastically changed; many other species have similar emotions. When we get sick, we strive to get well, often with the aid of bio-chemicals (medicine).

    But (assuming that we cannot grow human brains), an AI will experience reality differently from human experience, as with all things possessing awareness.

    Perhaps in an AI, the emotions of happy or sad could be a fundamental response *in the direction of greatest satisfaction*, i.e. running at greatest efficiency and *optimum performance* versus an awareness of *needing a tune-up* and self-repair - a measured self-awareness of organization and performance, and a tendency to adjust toward a naturally, mathematically efficient operation (function).
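    (A toy sketch of that idea, with invented names and thresholds: the system measures its own efficiency, treats the gap from optimum as its "satisfied" / "needs a tune-up" signal, and adjusts back toward efficient operation:)

        # Toy sketch: "emotion" as distance from optimum performance.
        # All names and thresholds are invented for illustration.
        OPTIMUM_EFFICIENCY = 1.0   # idealized "greatest satisfaction"
        TUNE_UP_THRESHOLD = 0.15   # how far from optimum counts as "needing a tune-up"

        def self_assess(efficiency: float) -> str:
            """Report the system's 'feeling' about its own performance."""
            gap = OPTIMUM_EFFICIENCY - efficiency
            return "satisfied" if gap <= TUNE_UP_THRESHOLD else "needs a tune-up"

        def self_repair(efficiency: float, step: float = 0.1) -> float:
            """Nudge performance back toward the optimum - the self-repair tendency."""
            return min(OPTIMUM_EFFICIENCY, efficiency + step)

        if __name__ == "__main__":
            efficiency = 0.6
            while self_assess(efficiency) == "needs a tune-up":
                print(f"efficiency={efficiency:.2f}: {self_assess(efficiency)}")
                efficiency = self_repair(efficiency)
            print(f"efficiency={efficiency:.2f}: {self_assess(efficiency)}")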
     
    Last edited: Dec 21, 2016
  22. Write4U Valued Senior Member

    Messages:
    20,069
    The gorilla Koko and her Manx kitten "All Ball" - a name she invented herself, because the kitten was all ball (no tail).

    and a lovely and heart-rending video:
     
  23. James R Just this guy, you know? Staff Member

    Messages:
    39,397
    I have to ask: Important to whom? Important for what?

    It depends what you want from your AI, I guess, as to whether consciousness is important.

    Yes. If they are like us, they will have their own goals and desires, just as we do. They might well have goals and desires different from ours.

    Personally, I would advocate that certain restrictions be built into AIs. Perhaps something similar to Asimov's three laws of robotics.
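    (A toy sketch of what built-in restrictions could look like, with invented rule names loosely in the spirit of Asimov's laws - a proposed action is allowed only if no rule, checked in strict priority order, forbids it:)

        # Toy sketch: constraints checked in strict priority order.
        # Rule names and the action format are invented for illustration.
        from typing import Callable, List, Tuple

        Rule = Tuple[str, Callable[[dict], bool]]  # (name, predicate: does the action violate it?)

        RULES: List[Rule] = [
            ("do not harm a human",   lambda a: a.get("harms_human", False)),
            ("obey human orders",     lambda a: a.get("disobeys_order", False)),
            ("protect own existence", lambda a: a.get("self_destructive", False)),
        ]

        def permitted(action: dict) -> Tuple[bool, str]:
            """Return (allowed, reason); the first violated rule, in priority order, wins."""
            for name, violates in RULES:
                if violates(action):
                    return False, f"blocked by rule: {name}"
            return True, "allowed"

        if __name__ == "__main__":
            print(permitted({"harms_human": True}))      # blocked by the highest-priority rule
            print(permitted({"self_destructive": True})) # blocked by the lowest-priority rule
            print(permitted({}))                         # nothing violated -> allowed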
     
