Dilemma: Should we build machines more intelligent than human beings if we could?

Discussion in 'Science & Society' started by Speakpigeon, Oct 6, 2019.

Should we build machines more intelligent than human beings if we could?

Poll closed Nov 5, 2019.
  1. Yes, we should: 83.3%
  2. No, we shouldn't: 0 vote(s), 0.0%
  3. I don't know: 16.7%
  4. The question doesn't make sense: 0 vote(s), 0.0%
  1. Speakpigeon Valued Senior Member

    Messages:
    1,123
    This is a poll. Please vote before posting comments.

    It seems plausible, if not rather likely, that one day humans will be able to build machines more intelligent than themselves. This would likely have all sorts of consequences, some good, some bad, for humanity as a whole or for some, possibly many, individuals. However, assuming we could do it, either we would do it or we wouldn't. Further, once someone discovers how to do it, it becomes very difficult not to do it. Governments will want to do it, the military will want to do it, businesses will want to do it, and many individuals will be minded to do it, making it almost inevitable that we will build machines more intelligent than human beings.

    So, the question is, would you be in favour of building such machines or not?

    And what would be your argument for or against, if you have one?
    EB
     
  3. Beer w/Straw Transcendental Ignorance! Valued Senior Member

    Messages:
    6,549
    Natural selection ended for the human race when communities were built.

    Yes. It is inevitable.
     
    Last edited: Oct 6, 2019
  5. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    We have built all kinds of machines that are stronger, faster, tougher than we are. That's kinda the whole point of machines. Why would smarter be different?
     
  7. billvon Valued Senior Member

    Messages:
    21,635
    We already do, and they have vastly improved the human experience. Calculators and computers that can do complex math far faster, and more accurately, than a human can. Programs that can look at ten million pictures and pick out the criminal suspect in one of them in seconds. Autopilots that can fly aircraft far more accurately and efficiently than humans can. Elevator controls that effectively never make mistakes. This will continue, with machines exceeding human intelligence in more and more areas.
     
  8. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    7,985
    For 1… I don't resist the inevitable… for obvious reasons!!!

    For 2… I get a chuckle out of people realizing that humans aren't the be-all and end-all… e.g. there's nothing about humans that can't be duplicated!!!
     
  9. Jeeves Valued Senior Member

    Messages:
    5,089
    It's our only hope.
     
  10. Speakpigeon Valued Senior Member

    Messages:
    1,123
    Why inevitable? We could democratically decide to kill all AI researchers, or take some similarly drastic measure. Once we know why this would be necessary, if it is, then we would do it.
    EB
     
  11. Speakpigeon Valued Senior Member

    Messages:
    1,123
    Fallacious reasoning.
    Stronger is different from faster, faster different from tougher, and tougher different from stronger. Why would smarter not be different?
    EB
     
  12. Speakpigeon Valued Senior Member

    Messages:
    1,123
    Why inevitable? We could democratically decide to kill all AI researchers, or take some similarly drastic measure. Once we know why this would be necessary, if it is, then we would do it.
    Fallacious reasoning.

    1. We already duplicate ourselves, so nothing new here, and no problem. Indeed, it's one solution to our short lives.
    2. My question is not about "duplicated" intelligence, but superior, and indeed far superior intelligence.
    3. As far as I know, there isn't any machine now that exceeds all our abilities. A machine may go faster, but it won't do all the myriad things we are capable of. So, there is no empirical evidence that we could even merely duplicate ourselves, let alone make a machine superior to us.
    4. And here, the question is about intelligence, and it is really questionable whether we could produce machines more intelligent than us. As far as I know, as of now, we haven't done it.
    EB
     
  13. Speakpigeon Valued Senior Member

    Messages:
    1,123
    Could you be more specific? Global warming? Human stupidity? What?
    EB
     
  14. sideshowbob Sorry, wrong number. Valued Senior Member

    Messages:
    7,057
    That doesn't stop us from making machines that are stronger and faster.
    That doesn't stop us from making machines that are faster and tougher.
    That doesn't stop us from making machines that are tougher and stronger.
    You tell us why it should.
     
  15. billvon Valued Senior Member

    Messages:
    21,635
    It is different - but it's also untrue. We DO have machines which are smarter than humans in many ways.
     
  16. Jeeves Valued Senior Member

    Messages:
    5,089
    Yes.
     
  17. C C Consular Corps - "the backbone of diplomacy" Valued Senior Member

    Messages:
    3,322
    "Should we or should we not" in the context of what? No particular system or school of thought at all that this is constrained to? Just inviting a random mixture of garbled folk views spouting techno-utopian platitudes and guidance abstracted from pop-culture?

    Allowing artilects might even be incompatible with such an outlook or school of thought, should it be human extinction through ceasing to reproduce that its adherents strictly prescribe, rather than our eradication by whatever happens to come along. Especially if what replaces us continues the same habits they dislike, but in grander fashion, enabled by faster processing, reproduction, and efficiency. (Could AI robots develop prejudice on their own? ... Self-replicating robots consuming a planet's abundance even more rapidly than humans?)

    Technophobic cultural orientations are surely going to blossom beyond just ordinary people's and Neo-Luddites' reservations as smart machines take away more and more jobs. We should probably recognize the emergence of those philosophies, lifestyles, and population groups, and consider their objections in advance when deciding their fate for them, since this is an era where we're so gung-ho about letting minority input steer the majority (the tail wagging the dog). Will it be possible to give them, say, the defrosted continent of Antarctica or unthawed Greenland as a refuge for their desire to remain in a pre-artilect lifestyle? (Only a few million of them, maybe, but they'll still need a land mass worth of resources to maintain at least a green version of a 20th-century lifestyle as a cut-off point.)

    Or perhaps more importantly, would the smart machines and cyborgs inheriting the world respect that agreement long after it was forged? The initial signs are that they are not going to be any more competent at being "moral" than "smart" people were. Being "intelligent like us" and "more intelligent than us" entails being able to escape purely invented boxes (especially with regard to discerning that such is what they indeed are).
     
  18. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    7,985
    I see humans as nothing more than a particular stage in evolution… and we will likely become extinct sooner than later in favor of machines!!!

    Are you a creationist type… and believe that humans will not evolve???
     
  19. Speakpigeon Valued Senior Member

    Messages:
    1,123
    Derail.
    EB
     
  20. Speakpigeon Valued Senior Member

    Messages:
    1,123
    Again, fallacious reasoning.
    The fact that we evolve is no good reason to stop thinking about what we should do. I don't think you yourself stopped thinking about what you should do in life just because you know you are going to die. Evolution is really long-term and AIs are short-term. It would be idiotic to justify ignoring the short-term problem by pointing out that we also have another problem in the very distant future. Further, how would evolution be a problem at all?
    EB
     
  21. C C Consular Corps - "the backbone of diplomacy" Valued Senior Member

    Messages:
    3,322
    His not caring about what happens due to evolution or any other _X_ is a legitimate stance since you didn't set a standard and context for opinions to conform to and be evaluated by in this thread. "... Anything goes." --Cole Porter
     
    Last edited: Oct 8, 2019
  22. Speakpigeon Valued Senior Member

    Messages:
    1,123
    I don't remember having asked that his post be deleted because it offended my religious beliefs or something. Of course he can say what he likes. Who cares?
    I said "fallacious". Fallacious reasoning is fallacious reasoning irrespective of how I introduce the topic. And he can do it. And I can say that what he does is fallacious. See?
    EB
     
  23. cluelusshusbund + Public Dilemma + Valued Senior Member

    Messages:
    7,985
    The evolution of AI being inevitable doesn't stop people from thinking… and among them are people who rail against advancement… yet, in spite of the nay-sayers' worry about problems, the march toward AI continues!!!

    Some people just don't do well with change… or is your belief in the impossibility of machines becoming more intelligent than humans based on religious-type reasons???
     
