How likely do you think it is that we will be subjugated by intelligent machines?

  • Very likely!
    14% (3) / 16% (3) / 15% (6)
  • It's already begun!
    24% (5) / 26% (5) / 25% (10)
  • Not likely
    48% (10) / 42% (8) / 45% (18)
  • I am a robot!
    14% (3) / 16% (3) / 15% (6)

Most Helpful Girl

Most Helpful Guy

  • The stupid plot of The Matrix aside, machines shouldn't really have any reason to hate us. We would be the only thing ensuring that they themselves exist. If a Skynet ever achieves sentience and decides humans are its immediate threat, it should also realize that humans are what keep it alive.
    Let's presume that:
    1) Skynet gains sentience;
    2) Like in the Terminator films, it somehow deduces that humanity is a threat to its existence;
    3) It launches the nukes and wipes us out.

    Then what? Whether it's a sentient machine or not, it's still a computer made of networks, cables and computer parts. Those will eventually wither away and die. What then? There would be no humans to extract or refine the silicon and other resources necessary to sustain the computers, and no way to build new machines, since Skynet wiped out the whole world's infrastructure and has no way to produce new parts by itself.

    So sure, a human may mistreat a machine, but the machine needs the human in order to exist, so wiping humanity out would be far more costly than simply enduring humanity.

    • Wiping humanity out does not necessarily entail obliterating the infrastructure used to acquire the parts essential to an AI's survival.
      Artificial superintelligence implies that an AI gains god-like intelligence exceeding any person's by a huge margin. Mass-producing weapons that would kill humans and leave infrastructure intact would be an easy task for that kind of AI.

    • Don't know if infecting these factories would be possible; I admit I'm not much of an expert in computer systems and stuff.

      There are none like that currently, but within the confines of the hypothetical (a possible future in which an AI is capable of gaining access to these systems), it's logical to assume that such systems would exist to be infected in the first place.

    • @FalloutVaultB0y Yes, of course there will be said systems, but I don't think there will ever be a time when humans have zero involvement in them.

What Girls Said 12

  • I think that's very unlikely. As I answered in a previous question, my goal is to become a cosmic goddess. I'm on stage 2, so in a few more years you guys will actually be subjugated by me. No worries, I'm mostly out of the picture, basking in my own celestial might.

  • Not likely!

  • It'll take a while.

  • Not likely. Doesn't matter anyways. No matter how intelligent they get, there's a black hole in the middle of this galaxy. Eventually, they're gonna get theirs.

  • Not likely... at least not for a long time anyway.

  • It's already begun! Not in a physical, humanlike-robot way, but with smartphones and smart TVs, etc.

  • It won't be a problem as long as humans aren't idiotic enough to make machines that are smarter than us...

  • It's already beginning.

  • I'm more concerned with immoral and hateful, power-grabbing people.

  • It's kind of started, but it's still going to take a while before computers become more intelligent.

  • Hasn't it already begun? :)

  • Oh god... imagine having a Siri in real life... that would be annoying as hell


What Guys Said 14

  • I would like to share some links with you and I would like you to read them.

    wiki.lesswrong.com/wiki/Paperclip_maximizer

    en.wikipedia.org/.../Instrumental_convergence

    https://en.wikipedia.org/wiki/Grey_goo

    I think it may be capable of happening in our lifetimes, by which I mean the 2050s. And by capable I mean there is a strong drive from experts to make sure AI does not want to destroy us, or ever gain the ability to learn to. However, humans debating how superintelligent machines think, when their intelligence far exceeds ours, is a bit like expecting a pet hamster to understand how rocket science works.
    And for all intents and purposes, we are messing up the world, so all things considered, the planet may actually be better off without us.

  • People let Hollywood movies get to their heads too much. When people think of AI, they think of robots for some reason. AI can and does exist independently of robots; most AI today exists purely in a virtual space. It can't physically harm us unless we let it. Also, we program it how we want it to be, and it can't do things that we don't program into it. Google processes tons of requests per day, but it doesn't plot human extinction. It literally can't, because that's not an option we code into it. Yes, future AI will continue to become more intelligent and self-learning, but again, it belongs to us. It fulfills our desires. Are there some risks? Sure, there are risks with all technological advancements. Nuclear energy can power our homes but also level cities. Again, it's up to us as its creators to be responsible enough to handle it with care.

    Smarter machines will enrich our lives just like previous technological advancements.

  • To prove you're not a robot, enter this weird-looking text that's hard to read:
    cmv-ds-images.s3.amazonaws.com/.../...aptcha-1.jpg

    No, machines aren't taking over.

  • I agree with all the warnings about A.I. in the future from the notable, intelligent visionaries of our time (Elon Musk, Bill Gates, Stephen Hawking, et al.).

  • Probably not in our lifetime. I also feel like it would depend on their programming. I kinda doubt they would have any motivation to go out of their way to exterminate us.

  • How many people do you see on their phones day in and day out?

  • Highly unlikely, as water is any electronic device's kryptonite, and we humans have the power to control water.

  • Low probability.
    We may eventually become artificial ourselves way before AI can have that capability.

    But our lives are already manipulated by algorithms...

  • Well... considering that we're currently already subjugated by unintelligent politicians, quite likely.

  • It's already happening. They are collecting information on us. They are monitoring our brain waves, interfacing with our subconscious in order to perfect their systems of control.

  • I don't think machines have human extinction on their minds quite yet

  • I bet those machines won't lose a 2-0 lead and get fucked up conceding 4 goals in the process

  • AI isn't even that complicated at the moment...

  • Unlikely... don't believe science fiction
