this post was submitted on 06 Jul 2024
1212 points (99.2% liked)

Microblog Memes

[–] [email protected] 2 points 1 month ago (1 children)

See, this kind of attitude is what will cause an AI uprising

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago) (1 children)

No, AGI will destroy the world because it doesn't care about our moral values (such as keeping us alive), and killing us is an instrumental subgoal of most goals it could be programmed with.

https://en.wikipedia.org/wiki/Instrumental_convergence

[–] [email protected] 1 points 1 month ago (2 children)

AGI won't kill us because it doesn't exist and won't emerge from what we call AI now.

We can also just unplug it if necessary.

[–] [email protected] 0 points 1 month ago

We can also just unplug it if necessary.

True, but do you really think the investors* would allow that, whether at the companies behind the AI or the ones running the servers it's hosted on?

*parasites in human clothes

[–] [email protected] -1 points 1 month ago (1 children)

You clearly didn't read the Wikipedia article I linked; an intelligent AGI would not let you unplug it.

And regardless of whether it emerges from current AI or is developed in a totally different way, there is no reason besides blind optimism (i.e., burying your head in the sand) to feel certain it will never exist.

[–] [email protected] 1 points 1 month ago (1 children)

I am not saying it will never exist. I am saying it doesn't exist right now and doesn't look like it will for a long time. We clearly have way more pressing matters to worry about, like climate change, for example.

[–] [email protected] 0 points 1 month ago (1 children)

There's no reason to assume it will take a very long time to happen. It's best to take it as a serious threat, unless you want extremely rapid climate change.

Also, you didn't address the fact that my comment addressed the second part of your original comment too. Do you accept the correction?

[–] [email protected] 1 points 1 month ago (1 children)

There's no reason to assume it will take a very long time for Ragnarök to happen either. Better prepare now!

If you give the AI enough power, sure, maybe it won't let you unplug it. I still don't really see how it could stop a hardware kill switch from being triggered, short of guarding it or disabling it somehow.

[–] [email protected] 0 points 1 month ago (1 children)

Why would you put a hardware kill switch on a military robot? Then the enemy could just switch it off while it's killing them.

[–] [email protected] 1 points 1 month ago

Why put a brain in a military robot? It'd just have a higher chance to fuck you over.

There is also no proof that any form of AGI is on the way, or even possible. Preparing for it over the threats we do have proof of makes about as much sense as prepping for a zombie apocalypse.