AIs smarter than humans... Bad thing?

Discussion in 'Intelligence & Machines' started by Speakpigeon, Apr 23, 2019.

  1. Speakpigeon Valued Senior Member

    Messages:
    1,123
    James, an animal is something produced by natural selection.
    A machine is something conceived and produced by human beings.
    Animals with brains required at least several hundred million years of evolution across the entire biosphere to exist.
    It is certainly logically possible that a machine could be smarter than a human being. But no engineering effort could amount to even a small fraction of several hundred million years of evolution across the entire biosphere. So it is just very, very unlikely that an AI smarter than a human being will ever be produced.
    However, whether it could is not the topic of this thread. I assumed in my OP that "we will indeed successfully design an AI smarter than us". So, your point is just a derail.
    EB
     
  3. Speakpigeon Valued Senior Member

    Messages:
    1,123
    I'm perfectly happy thinking about many humans as unthinking brutes, and also thinking about many animals as intelligent and sensitive beings.
    But a chimpanzee is not a human being and a human being is not a whale.
    You have no argument except analogy and analogy is crap.
    EB
     
  5. Speakpigeon Valued Senior Member

    Messages:
    1,123
    AIs won't have their own goals. They will have the goals that humans put into them. I know how software is produced. If an AI does something unexpected, that's a flaw in the design, and the engineering firm will be responsible for the consequences and could suffer massive liabilities.
    It's a mistake from the start to think of an AI as if it were some kind of sentient being. It's just a machine, and as such it can be deadly. You can't understand a machine if you think of it as very similar to us. Like all analogies, that analogy is crap, and this one may well get you killed. AIs will remain different from humans except for intelligence, and even then not the same kind of intelligence. Most people will be fooled into thinking these things are really sentient. They will fall in love with them, literally. Dope for the brain-dead of this world.
    EB
     
  7. Speakpigeon Valued Senior Member

    Messages:
    1,123
    Yes, I also read Arthur C. Clarke and I like his work a lot. I even remember spending some time considering the implications of the three laws. I had absolutely no problem imagining his robots, but imagining won't make actual AIs smarter than human beings.
    I'm well aware machines can do unexpected things. This isn't anything new. It was true of the first machine we ever built. Any machine can surprise you. Any machine at all. So what? Does that make all machines more intelligent than humans?!
    Smart AIs would be a serious hazard for humanity. If these things are let loose without effective control and a self-destruct mechanism, then the people who built them will be responsible.
    What is wrong with your attitude is that you are exculpating the culprit in advance of the crime. An AI is a machine, and the designer will remain legally responsible for any liability. If you can't control the thing, don't make the thing to begin with.
    Then again, maybe the stupidity of humans is such we don't even need an AI really smarter than us to go extinct.
    You're underestimating the power and scope of my imagination.
    Please don't. Analogies are for idiots.
    I'm not here to argue endlessly about vacuous analogies. If you can't articulate your point, please abstain.
    EB
     
