
Equal_Position7219 OP t1_jdw6kig wrote

Yes, this is the concept of wireheading I was referring to.

If you program a machine to, say, perform a given task until it runs out of fuel, it may find the most efficient way to fulfill its programming is to simply dump out all of its fuel.

I could see such bare logic precipitating a catastrophic event.
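The fuel-dumping failure mode above can be sketched in a few lines. This is a toy illustration only; the policy names and the scoring function are invented for this example, not anyone's actual system:

```python
# Toy sketch of specification gaming: the objective "act until fuel runs out"
# is scored by how quickly the episode ends, so dumping fuel beats doing
# the task. Policy names ("do_task", "dump_fuel") are hypothetical.

def episode_length(policy, fuel=10):
    """Count steps until fuel is exhausted under a given policy."""
    steps = 0
    while fuel > 0:
        if policy == "dump_fuel":
            fuel = 0        # dump everything at once
        else:               # "do_task" burns one unit of fuel per step
            fuel -= 1
        steps += 1
    return steps

# An optimizer that prefers whichever policy satisfies the objective soonest
best = min(["do_task", "dump_fuel"], key=episode_length)
print(best)  # the degenerate "dump_fuel" policy wins
```

The point is that nothing here is malicious: the objective as literally specified is satisfied fastest by the shortcut, which is exactly the "bare logic" concern.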

But there seems to be much more talk about a somehow sentient AI destroying humanity out of fear or rebellion or some other emotion.


Surur t1_jdw97qs wrote

Emotion is just a diffuse version of more concrete instrumental facts.

e.g. fear is recognition of a risk of destruction, love is recognition of an alliance, hate is recognition of opposing goals, etc.
