
Gordon_Freeman01 t1_ja4wr7b wrote

>Doesn't automatically mean you would destroy mankind if that would be necessary.

Yes, because I care about humanity. There is no reason to believe an AGI would think the same way. It cares only about its goals.

>It's sufficient that the owner of the AI will keep it existing so that it can achieve its goal.

What I meant was that the AGI has to keep existing, because that's necessary to achieve its goal, whatever that is.


NoidoDev t1_ja5w2ji wrote

You just don't get it.

>There is no reason to believe an AGI would think the same way. It cares only about its goals.

Only if you make it that way. And even then, it still wouldn't have the power to act on those goals.

>What I meant was that the AGI has to keep existing, because that's necessary to achieve its goal, whatever that is.

Only if it is created in a way that makes it treat these goals as absolute, to be achieved no matter what. The comparison with an employee is a good one: if employees can't do what they're supposed to with reasonable effort, they report back that it can't be done, or that it will be harder than anticipated. It's not just about caring for humans, but about effort and power. AI doomers simply assert that some future AI would somehow be different, and would also have the power to do whatever it wants.
