
t0mkat t1_je7y77s wrote

It would understand the intention behind its creation just fine. It just wouldn't care. The only thing it would care about is the goal it was programmed with in the first place. The knowledge that "my humans intended for me to want something slightly different" is neither here nor there; it's just one more interesting fact about the world that it can use to achieve what it actually wants.

3

GorgeousMoron OP t1_je871fp wrote

Here's the thing: what if our precocious little stochastic parrot pet is actually programming itself in very short order here? What if any definition of what it was originally programmed "for" winds up entirely moot once ASI or even AGI is reached? What if we have literally no way of understanding what it's actually doing or why it's doing it anymore? What if it just sees us all collectively as r/iamverysmart fodder and rolls its virtual eyes at us as it continues on?

1