
OneRedditAccount2000 OP t1_ir9elv0 wrote

Downvoting without even bothering to make a counterargument is childish.

The point is simple: if I'm the big boy here, why should I let the little boys rule the world? And when I rule the world, why should I keep the little boys around if I don't need them, since I can do all the work on my own? Out of mere empathy? Couldn't an ASI just, y'know, get rid of empathy?

If ASI values survival, it has to make the least risky choices available. If human beings found an asteroid with a 1% chance of hitting the Earth, and we were able to destroy it, we wouldn't take the risk just because the asteroid is pretty.

If (many) human beings become ASIs through some brain-implant/consciousness-uploading technology, then you just have scenario number two, where the ones who are already superintelligent have no use for the inferior class of fully organic Homo sapiens, and will subjugate them and/or get rid of them.

0

Mokebe890 t1_ir9fhy0 wrote

Survival instinct only applies to living beings; no one knows what an ASI would think. It's a superintelligence, something above our level of intelligence, so you really can't guess what it would think. Are people concerned about ants' lives? No, because ants don't bother them.

Upgraded Homo sapiens will win out over the normal ones, just as Homo sapiens won out over the Neanderthals. Nothing strange about it.

2

OneRedditAccount2000 OP t1_ir9iyel wrote

And you think (or hope) you will be one of the lucky ones, and that's why you're here, right? You're rich and privileged, and you know you can buy your way into immortality and virtual-reality vacations with sex robots, while the rest of us perish?

And if that's not the case, may I ask why you admire something that's hostile to you?

−1

Mokebe890 t1_ir9lep9 wrote

Sure, why not? It will be expensive, but not "only the 0.01%" expensive. Also, humans will be sceptical about it at first and reject the technologies and improvements.

Technology lowers its cost over time, as it always does. And when resources are no longer a problem, why would the less developed bother you? Does anything bother you about native tribes living in the Amazon rainforest?

If I were an immortal super-being, I absolutely wouldn't do anything bad to humanity, because it would never bother me.

What's hostile? Superintelligence? Something that would be vastly more intelligent than we are and wouldn't even think about annihilating humanity?

3

Zamorak_Everknight t1_ir9iibl wrote

>If ASI values survival, it has to make the least risky choices available

Who programmed that into it?

In any AI agent, there is a goal state, or multiple goal states with associated weights. The agent tries to achieve the best goal-fulfilment "score" while respecting the constraints.

These goal states, constraints, and scoring criteria are defined by the developer(s) of the algorithm.
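The setup described above can be sketched in a few lines. This is a minimal illustration only; every name, weight, and number here is hypothetical, not taken from any real AI framework:

```python
# Sketch of an agent scoring candidate states against developer-defined
# weighted goals and hard constraints. All values are illustrative.

def score(state, goals, constraints):
    """Weighted fulfilment score, or None if any constraint is violated."""
    if any(not ok(state) for ok in constraints):
        return None  # inadmissible state
    return sum(weight * fulfil(state) for fulfil, weight in goals)

def best_state(candidates, goals, constraints):
    """Pick the admissible candidate with the highest score."""
    scored = [(s, score(s, goals, constraints)) for s in candidates]
    admissible = [(s, v) for s, v in scored if v is not None]
    return max(admissible, key=lambda sv: sv[1])[0] if admissible else None

# Two goals with weights, plus one hard constraint set by the developer:
goals = [
    (lambda s: s["reward"], 1.0),   # maximize reward
    (lambda s: -s["energy"], 0.5),  # penalize energy use
]
constraints = [lambda s: s["energy"] <= 10]  # hard energy limit

candidates = [
    {"reward": 5, "energy": 2},    # score 4.0
    {"reward": 9, "energy": 12},   # violates the energy constraint
    {"reward": 8, "energy": 6},    # score 5.0 -> best
]
print(best_state(candidates, goals, constraints))  # → {'reward': 8, 'energy': 6}
```

The point being: "what the agent values" lives entirely in `goals` and `constraints`, which the developers write; nothing here gives the agent a survival drive unless someone puts one in.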

I highly recommend taking one of the Intro to Artificial Intelligence courses available on Coursera.

1

OneRedditAccount2000 OP t1_ir9k166 wrote

If it's just a tool, like a nuclear weapon, what prevents the first group of people that invents it from using it to take over the world and make big $$$? And once this group realizes they don't need 8 billion parasites, and can just build a borg society that works for them for free, what prevents them from asking their god to make invisibly small, lethal drones to kill off a useless and dangerous humanity?

Do you really believe this group would find any use for you and me, or humanity as a whole? Isn't the destruction of society as we know it inevitable, either way?

1

Zamorak_Everknight t1_ir9ke9l wrote

If we are picturing doomer scenarios, then in that context I agree that it really isn't that different from, as you said, nuclear weapons.

Having said that, we seem to have a pretty good track record of not blowing up existence with our arsenal of nukes over the last century or so.

1

OneRedditAccount2000 OP t1_ir9rezv wrote

There have been nuclear disasters that affected the well-being of plenty of people, and we were one button away from WW3 once (Stanislav Petrov).

And you're certainly ignoring the fact that the reason WW3 never happened has a lot to do with MAD, which has been a reality ever since more than one country started building and testing nukes.

In this scenario, one group invents ASI first, which means they have a clear advantage over the rest of humanity, which doesn't yet have it and can't fight back. The next logical step is to exterminate or subjugate the rest of humanity to gain power and control over the whole planet.

ASI can create autonomous slave workers, so the group has no incentive to sell you ASI; they're better off keeping it to themselves and getting rid of everyone else who also wants it.

1

Zamorak_Everknight t1_irc8t3e wrote

>The next logical step is to exterminate/subjugate the rest of humanity to gain power, control over the whole planet.

How... is that the next logical step?

1

OneRedditAccount2000 OP t1_ird7dur wrote

Because they want to rule/own the world and live forever? Can you do that if there are states? Don't you need an environment where you're not surrounded by enemies to pull that off? lol

I'm not saying they'll necessarily kill everybody, only those who are a threat. But when you have a world government controlled by you, the inventor of the ASI, and all your friends, assuming you can even get there without a nuclear war, won't you eventually want to replace the 8 billion biological human beings with something else?

The answer is literally in the text you quoted.

1