
Wolkrast t1_ja7o3mz wrote

>The reason why AI can’t love anything or yearn to be free is because it has no body. It has no source of feeling states or emotions, and these somatic feelings are essential for animal consciousness, decision-making, understanding, and creativity. Without feelings of pleasure and pain via the body, we don’t have any preferences.

The article makes a number of very strong claims here. At the very least, we know that AI is capable of decision-making; in fact, that is the only thing it is designed to do.

The heart of the argument seems to be less about a body - after all, a robot with onboard AI would fulfill that definition, which is clearly not what the author is talking about - but about the difference between decisions motivated by logic and decisions motivated by feelings. This raises the question of how, for example, pain avoidance differs from optimizing a value function to avoid things that deduct from its score. From the outside there is no way to observe that difference, because all we can observe is the behavior, not the decision-making process.
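To make the "pain avoidance as score optimization" point concrete, here is a minimal sketch (my own illustration, not anything from the article): an agent that simply picks whichever action deducts the least from its score. The action names and penalty values are hypothetical.

```python
def choose_action(actions, penalty):
    """Greedy 'pain avoidance': pick the action with the smallest penalty."""
    return min(actions, key=penalty)

# Hypothetical world: touching a hot plate deducts 10 points,
# waiting deducts 1, walking away deducts nothing.
penalties = {"touch_hot_plate": 10, "wait": 1, "walk_away": 0}

best = choose_action(penalties.keys(), penalties.get)
# From the outside we only observe the resulting behavior - the agent
# avoids the "painful" option - not whether anything was "felt" inside.
```

An observer watching this agent recoil from the hot plate sees the same behavior a pain-feeling animal would produce, which is the commenter's point: behavior alone can't distinguish the two.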

We should remember that until as recently as 1977, animals were generally considered mere stimulus-response machines. Today you'd be hard-pressed to find a scientist arguing that animals are not conscious.


Turokr t1_jaaj8iz wrote

I could argue that an AI's "decision making" is no different from a water molecule's "decision making" to go down once it reaches a waterfall.

Since both are only acting in response to complex external inputs.

But then we would go into determinism and how technically the same could be said about humans, so let's not do that.
