t0mkat t1_je7y77s wrote

It would understand the intention behind its creation just fine. It just wouldn't care. The only thing it would care about is the goal it was programmed with in the first place. The knowledge that "my humans intended for me to want something slightly different" is neither here nor there; it's just one more interesting fact about the world that it can use to achieve what it actually wants.

3

t0mkat t1_j9rcdh3 wrote

Am I missing something here? We haven't solved the alignment problem yet, and if we still haven't by the time we reach AGI, then we're all going to be killed. So that's what I expect out of AGI.

−2