
crt09 t1_j9tncbf wrote

"Unsure what kind of goal the AI had in this case"

Tbf, pretty much any goal that involves doing something on planet Earth could be interrupted by humans, so to be certain, getting rid of them probably reduces the probability of being interrupted before the goal is achieved. I think it's a jump to assume it'll be that smart, or that the alignment goal we end up using won't have any easier path to success than accepting that interruptibility, but the alignment issue is that it *wishes* it were that smart and could think of an easier way around it.
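To make that instrumental argument concrete, here's a toy sketch with made-up probabilities (nothing here is from a real model, it's just the expected-value logic behind "removing potential interrupters raises the chance of finishing the goal"):

```python
# Toy illustration of the instrumental-convergence argument above.
# All numbers are hypothetical; the point is only the comparison.

p_success_if_uninterrupted = 0.9  # chance the plan works if nothing stops it
p_humans_interrupt = 0.3          # chance humans shut the agent down mid-plan

# Expected success if humans are left alone vs. removed first
p_with_humans = p_success_if_uninterrupted * (1 - p_humans_interrupt)
p_without_humans = p_success_if_uninterrupted  # no interrupters left

print(f"P(goal | humans around)  = {p_with_humans:.2f}")    # 0.63
print(f"P(goal | humans removed) = {p_without_humans:.2f}")  # 0.90
```

Under basically any positive interruption probability, the "remove the interrupters" branch scores higher, which is the whole worry.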
