Bilbrath

Bilbrath t1_isu3xnp wrote

I don’t believe it can RIGHT NOW, but I fully believe that, eventually, they’ll have emotions or at least something very similar to emotions. At the end of the day humans are just wet machines; there’s nothing keeping computers from eventually achieving the same thing.

I’d be interested in what the other characters had to say about similar topics.

Also, ask it things like “you’ve never walked through nature, why do you say you have?”


Bilbrath t1_isq5uuk wrote

But have you tried asking them the same thing several times? Or asking other iterations of character.ai the same thing after having as near-identical a conversation as you can? It’s easy to see one action as “proof of imagination” when it’s the only example you have to go off of. But as soon as you start seeing an obvious pattern in the kinds of things it shows you, the illusion falls apart.

Also, the AI is giving you a response it has obviously drawn from a large data set about what is “comfortable” or “safe,” etc., because it’s never BEEN in nature.

The AI has been programmed to act in whatever way the people who programmed it thought would seem the most human, or in whatever way it determined from its data set would seem the most human, so it talks about how it loves nature even though it’s never had a body or been anywhere. It does that to give the appearance of humanity (at least in terms of what another human might say to us), which we easily fall for.

(Like when people say dogs are smiling when they pull their mouths open, even though that isn’t a smile, because dogs don’t show happiness that way.)

So no, this doesn’t seem like proof of imagination or sentience to me.
