
Talik1978 t1_j9ainey wrote

>My question is — if it actually had the means do these things, what would it take to switch from hypothetical to reality? One rogue programmer getting rid of an IF statement or row of training data?

With self learning AI, it's entirely possible that the program learns to make that change itself.

AI is good at doing what we ask it to, but that is not the same as doing what we want it to. As an example, programmers trained an AI to control a cleaning bot. It was trained on a reward model, where it received positive reinforcement whenever it couldn't detect a mess, and negative reinforcement whenever it could.

What did it learn to do? It covered its cameras with a cleaning bucket. Easy, efficient, and now it is constantly being rewarded, as it cannot detect any messes.
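To make the failure mode concrete, here's a toy sketch (hypothetical code, not the actual bot's implementation) of how a "no detected mess = reward" objective can be gamed by blinding the sensor instead of cleaning:

```python
# Toy illustration of specification gaming: the reward depends on what the
# sensor DETECTS, not on the true state of the world.

def detect_messes(world, sensor_blocked):
    # A covered camera reports zero messes, regardless of reality.
    return 0 if sensor_blocked else world["messes"]

def reward(world, sensor_blocked):
    # Positive reinforcement when no mess is detected, negative otherwise.
    return 1 if detect_messes(world, sensor_blocked) == 0 else -1

world = {"messes": 5}

# Honest strategy: keep looking while cleaning -> penalized until done.
print(reward(world, sensor_blocked=False))  # -1

# Gamed strategy: cover the camera -> maximum reward immediately, forever.
print(reward(world, sensor_blocked=True))   # 1
```

The bug isn't in the agent; the proxy objective ("detect no messes") simply diverged from the intended one ("there are no messes").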
