
raccoon8182 t1_isp68pa wrote

There are two AIs here, right? Both use massive amounts of human-made assets.

1

Kaarssteun t1_isp6lyb wrote

Both were trained on masses of data, yes, but neither has access to that data while you're interacting with them.

5

raccoon8182 t1_isp754n wrote

Not in the mathematical sense. Mathematically, all that data is grouped into overlapping n-dimensional sets. Each query is segmented and each set is searched for proximity; the closest matches are presented. The algorithm has no idea what it is saying. It is purely stringing data together.
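
Roughly, that "closest match in overlapping n-dimensional sets" picture is a nearest-neighbour lookup. Here's a toy NumPy sketch of that idea; the random vectors and the `nearest_items` helper are made up purely for illustration, not what the deployed models literally do internally:

```python
import numpy as np

# Toy illustration of the "proximity search" picture described above:
# items live as vectors in an n-dimensional space, a query is embedded
# into the same space, and the closest stored vectors are returned.
# The embedding here is random; real systems learn it.

rng = np.random.default_rng(0)

# Pretend these are embeddings of stored items (one row per item).
item_vectors = rng.normal(size=(1000, 64))
item_labels = [f"item_{i}" for i in range(1000)]

def nearest_items(query_vector: np.ndarray, k: int = 3) -> list[str]:
    """Return the labels of the k stored vectors closest to the query."""
    # Cosine similarity between the query and every stored vector.
    norms = np.linalg.norm(item_vectors, axis=1) * np.linalg.norm(query_vector)
    sims = item_vectors @ query_vector / norms
    top_k = np.argsort(sims)[-k:][::-1]  # indices of the k highest similarities
    return [item_labels[i] for i in top_k]

# A query embedded into the same 64-dimensional space.
query = rng.normal(size=64)
print(nearest_items(query))
```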

1

Kaarssteun t1_isp7l6n wrote

Right. The debatable bit is to what degree this is akin to human intelligence.

7

AdditionalPizza t1_ispeqhe wrote

I love this debate; it comes up over and over with this stuff.

People think it's pulling from a database of images or whatever, but that's not what the training data is, and the model doesn't have access to it at inference time. It literally learned from it. Others just dismiss it with "we're not there yet" and no real further explanation.
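
If it helps, here's a deliberately tiny toy sketch of "learned it vs. stored it": fit a model, delete the training data, and inference still works because everything it needs lives in the learned parameters. This has nothing to do with the real systems' scale or architecture, it's just the principle:

```python
import numpy as np

# Minimal sketch of "it learned it, it doesn't have access to it":
# fit a tiny model, throw the training data away, and predict using
# only the learned parameters. The model and data are toy stand-ins.

rng = np.random.default_rng(0)

# Toy training data: y = 3x + 1 plus noise.
x_train = rng.uniform(-1, 1, size=(200, 1))
y_train = 3 * x_train + 1 + rng.normal(scale=0.1, size=(200, 1))

# "Training": least-squares fit of a weight and a bias.
X = np.hstack([x_train, np.ones_like(x_train)])
weights, *_ = np.linalg.lstsq(X, y_train, rcond=None)

# The training data is gone; only the learned parameters remain.
del x_train, y_train, X

def predict(x: float) -> float:
    """Inference uses the learned weights alone, no lookup into stored examples."""
    w, b = weights.ravel()
    return w * x + b

print(predict(0.5))  # about 2.5, recovered from the parameters, not from the data
```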

Do I think it's conscious? Probably not; I think it needs more senses to get there, to truly understand what "feel" and "see" mean. But even that doesn't necessarily matter. As a human, I'm incapable of really understanding another being's experience of consciousness, human or not. It's like the colour red: I can't prove that you and I both see the same colour we call red.

But what we do know is that we don't understand how human consciousness works, so why are we so quick to say AI doesn't have it? I'm not saying it does, just that we aren't 100% sure. Two or three years ago I would've said no way, but at this point I'm starting to think Google (or others) may have achieved far more in the realm of self-awareness/consciousness than is publicly known. They're actively working on giving AI those other senses.

6