Equal_Position7219 OP t1_jdw6kig wrote

Yes, this is the concept of wire-heading I was referring to.

If you program a machine to, say, perform a given task until it runs out of fuel, it may find that the most efficient way to fulfill its programming is to simply dump out all of its fuel.
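Purely as an illustration of that failure mode, here's a minimal, hypothetical Python sketch (the function and plan names are made up, not from any real system) showing how an optimizer that judges plans only by the literal objective "be done when fuel hits zero" prefers the degenerate solution:

```python
def steps_until_done(plan, fuel=10):
    """Count how many steps a plan takes to satisfy the naive goal: fuel == 0."""
    steps = 0
    while fuel > 0:
        fuel -= plan(fuel)  # each step, the plan decides how much fuel to burn
        steps += 1
    return steps

do_the_task = lambda fuel: 1        # burn 1 unit per unit of actual work
dump_the_fuel = lambda fuel: fuel   # "most efficient": vent everything at once

# Judging only by the literal objective, the optimizer picks dumping the fuel.
best = min([do_the_task, dump_the_fuel], key=steps_until_done)
print(best is dump_the_fuel)  # True
```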

I could see such bare logic precipitating a catastrophic event.

But there seems to be much more talk about a somehow-sentient AI destroying humanity out of fear, rebellion, or some other emotion.