Wolkrast t1_ja7o3mz wrote

>The reason why AI can’t love anything or yearn to be free is because it has no body. It has no source of feeling states or emotions, and these somatic feelings are essential for animal consciousness, decision-making, understanding, and creativity. Without feelings of pleasure and pain via the body, we don’t have any preferences.

The article makes a number of very strong claims here. At the very least we know that AI is capable of decision-making; in fact, that is the only thing it is designed to do.

The heart of the argument seems to be less about a body per se - after all, a robot with an onboard AI would satisfy that definition, which is clearly not what the author means - and more about the difference between decisions motivated by logic and decisions motivated by feelings. That raises the question of how, for example, pain avoidance differs from optimizing a value function to avoid states that deduct from its score. From the outside there is no way to observe that difference, because all we can observe is the behavior, not the decision-making process.
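To make the comparison concrete, here is a minimal sketch (all names, states, and numbers invented for illustration) of an agent that "avoids pain" purely by maximizing a scalar value function - behaviorally indistinguishable from felt aversion:

```python
# Illustrative sketch only: "pain avoidance" as value-function optimization.
# The value function, states, and actions are all hypothetical.

def value(state):
    # States tagged "hot" deduct heavily from the score.
    return -10.0 if state.get("hot") else 1.0

def choose_action(actions, transition, state):
    # Pick the action whose successor state has the highest value.
    return max(actions, key=lambda a: value(transition(state, a)))

# Toy world: moving "left" lands on a hot plate, "right" on a safe tile.
def transition(state, action):
    return {"hot": action == "left"}

best = choose_action(["left", "right"], transition, {"hot": False})
print(best)  # → right: the agent "avoids" the hot plate with no felt pain
```

An outside observer sees only the avoidance behavior; nothing in the output reveals whether it was produced by a feeling or by an arithmetic comparison.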

We should remember that until as recently as 1977, animals were generally regarded as mere stimulus-response machines. Today you'd be hard pressed to find a scientist arguing that animals are not conscious.


Wolkrast t1_ja7i41r wrote

So you're implying that what matters is the ability to adapt, not the means by which the body came into existence?
There are certainly algorithms around today that can adapt to a variety of circumstances, and an agent that does not influence its environment at all sounds conceptually impossible.
Granted, the environments we put AIs into today are mostly simulated, but there is no reason other than caution that we couldn't extrapolate this to the real world.
