Submitted by Akashictruth t3_yt3x9f in singularity
OneRedditAccount2000 t1_iw3bl2w wrote
Reply to comment by HeinrichTheWolf_17 in What if the future doesn’t turn out the way you think it will? by Akashictruth
A superintelligence could be limited by circumstance and have only a finite decision space consisting of two choices. It can think thoughts your tiny human brain wouldn't think in a billion years, but the AI wouldn't be completely unpredictable under all circumstances.
Think of it like this: you're smarter than your dog, right? You can think thoughts your dog can't. You have more thinking power. Just like AI has more thinking power than you.
But if both you and your dog are being chased by a tiger, and there's no other way to survive, both of you will make the same choice: running, because you both want to survive. Maybe you can run in a way your dog can't, but you'll still be running.
I've been called a moron on this sub so many times (they all deleted their comments lol), but you people can't even get basic logic right. You're parroting a sentence you haven't even bothered to scrutinize, just because an authority figure said it.
TheHamsterSandwich t1_iw3i8i8 wrote
Yes. The superintelligence will be perfectly predictable and we will know exactly how it does what it does. Just like how dogs perfectly understand and comprehend the concept of supermarkets.
OneRedditAccount2000 t1_iw3ohxk wrote
Where in my comment have I said that it will be perfectly predictable?
Why are you disagreeing for the sake of disagreeing?
Okay, let's say I put you in a cage with a tiger, and your dog. This tiger isn't even a real tiger; it's a magical tiger that only kills people who don't move. You know this because I told you before putting you in the cage. Your dog also knows it, for the sake of the thought experiment, but leaving that aside, he's a normal dog. He can't play chess or the guitar or think about the meaning of life like you can.
What are you going to do now? You will think many thoughts your dog couldn't predict, but you will still have to use the same thought, "running," that your dog with its inferior intellect will also use, because you value not being eaten by my tiger.
You're in a binary situation. There's only one solution.
You can't use your superior intellect; it's of no use in that situation.
You move or you die.
Do you die for the sake of looking cool and unpredictable to your dog?
AI and humans live in the same universe, and both have to respect the laws of nature of this universe.
TheHamsterSandwich t1_iw3rdos wrote
Our understanding of physics is incomplete. You can't say for certain what an artificial super intelligence can or can't do. Neither can I.
OneRedditAccount2000 t1_iw3wa3m wrote
I can, because I know what it values: it values survival, and I just put it in a situation with only two choices and only one solution. Move/run, or do something other than moving/running. It can only survive by choosing to run. It can think many thoughts I cannot predict, but in that situation it has to use a thought that I can also understand, granted I probably can't understand 99.99… percent of its thinking.
If you put the A.I. in that cage, tell me, is it going to get eaten by the tiger? Is it going to choose to do literally anything other than running: jump, do nothing, look at the sky, dance, shout, whatever? Or is it actually going to run in the cage because it doesn't want to fucking die?