
TheHamsterSandwich t1_iw3i8i8 wrote

Yes. The superintelligence will be perfectly predictable and we will know exactly how it does what it does. Just like how dogs perfectly understand and comprehend the concept of supermarkets.


OneRedditAccount2000 t1_iw3ohxk wrote

Where in my comment have I said that it will be perfectly predictable?

Why are you disagreeing for the sake of disagreeing?

Okay, let's say I put you in a cage with a tiger and your dog. This tiger isn't even a real tiger; it's a magical tiger that only kills people who don't move, and you know this because I told you before putting you in the cage. Your dog also knows it, for the sake of the thought experiment, but otherwise he's a normal dog. He can't play chess, play the guitar, or think about the meaning of life like you can.

What are you going to do now? You will think many thoughts your dog couldn't predict, but you will still land on the same thought of "running" that your dog, with its inferior intellect, will also land on, because you value not being eaten by my tiger.

You're in a binary situation. There's only one solution.

You can't use your superior intellect; it's of no use in that situation.

You move or you die.

Do you die for the sake of looking cool and unpredictable to your dog?

AI and humans live in the same universe, and both have to respect the laws of nature of this universe.


TheHamsterSandwich t1_iw3rdos wrote

Our understanding of physics is incomplete. You can't say for certain what an artificial superintelligence can or can't do. Neither can I.


OneRedditAccount2000 t1_iw3wa3m wrote

I can, because I know what it values: it values survival, and I just put it in a situation with only two choices and only one solution. Move/run, or do something other than moving/running. It can only survive by choosing to run. It can think many thoughts I cannot predict, but in that situation it has to use a thought that I can also understand, granted I probably can't understand 99.99... percent of its thinking.

If you put the A.I. in that cage, tell me, is it gonna get eaten by the tiger? Is it gonna choose to do literally anything other than running: jump, do nothing, look at the sky, dance, shout, whatever? Or is it actually going to run in the cage because it doesn't want to fucking die?
