
CouldntThinkOfClever t1_j2636df wrote

Sentience, as you have rightly pointed out, is awareness of your senses. Sapience is a higher level of consciousness, one which requires the ability to reason about and understand your own existence.

5

4art4 t1_j267zzf wrote

Yes, and ChatGPT does nothing while it is not in use. It does not daydream, or plan, or anything else. So even if it responds reasonably to questions about its own existence, it is only simulating consciousness.

But... I think that if you hooked up three ChatGPT systems to talk to each other, and created some sort of feedback routine so that the system asked itself questions, we would be getting closer. The questions would need to be motivated somehow, and the answers would need to be saved and built on.
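
Just to put a shape on it, here is a minimal Python sketch of the kind of loop I mean. The `ask_model` function is a hypothetical placeholder for a call to a chat model (no real API here), and the "motivation" is reduced to a hard-coded seed question, which is exactly the part I don't know how to solve:

```python
# Rough sketch of the "several models in a feedback loop" idea.
# ask_model() is a hypothetical stand-in for a call to a chat model;
# nothing here is a real API.

def ask_model(name: str, prompt: str) -> str:
    """Placeholder: imagine this sends `prompt` to a chat model and returns its reply."""
    return f"[{name}'s answer to: {prompt}]"

def feedback_loop(seed_question: str, rounds: int = 3) -> list[str]:
    agents = ["A", "B", "C"]      # three model instances talking to each other
    memory: list[str] = []        # answers are saved and built on
    question = seed_question
    for _ in range(rounds):
        for agent in agents:
            context = " ".join(memory[-3:])                 # feed recent answers back in
            answer = ask_model(agent, f"{context} {question}".strip())
            memory.append(answer)
            # each answer becomes the next question for the other agents
            question = f"What follows from: {answer}?"
    return memory

if __name__ == "__main__":
    for line in feedback_loop("What are you?"):
        print(line)
```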

6

CouldntThinkOfClever t1_j26b4xr wrote

Systems like ChatGPT will never even approximate sapience. The problem is that they're programmed with the know-how of predictive text, but lack any semblance of critical-thinking training.

1

4art4 t1_j26bbrt wrote

True, but they are a step in that direction.

2

warren_stupidity t1_j26ks5l wrote

> It does not daydream, or plan, or anything else.

Well, that might or might not be true, especially the 'or plan, or anything else' part, but it is also irrelevant unless you are asserting that these activities are essential properties of consciousness. If you are asserting that, how do you justify it?

1

4art4 t1_j26ojaj wrote

> unless you are asserting that these activities are essential properties of consciousness.

Yes. A "thinking" machine that does not plan is not "conscious" in my book. How can it be otherwise?

Not so much for dreaming; I included that to point out that when it is not responding to a prompt, it is not doing anything. It is not considering the universe or its place in it. It is not wishing upon a star. It is not hoping for world peace (or anything else). It is just unused code in that moment.

1

warren_stupidity t1_j26z0s8 wrote

Well, we will have to disagree about whether planning is essential for consciousness. But I disagree that AI cannot 'plan'. It's exactly what autonomous vehicles do: they process real-time data to update their navigation 'plan' by building and maintaining a model of the space around them.
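
Roughly this kind of loop, to make it concrete. Everything below is an illustrative toy (a grid world and a breadth-first-search planner), not any real AV stack, but it shows the plan being rebuilt every time new data arrives:

```python
# Minimal sense -> model -> re-plan loop of the kind an autonomous vehicle runs
# continuously. Toy example: 10x10 grid world, BFS planner, invented names.

from collections import deque

GRID = 10  # toy 10x10 world

def plan_route(start, goal, obstacles):
    """Breadth-first search over the grid, avoiding known obstacles."""
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = [cell]
            while came_from[cell] is not None:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < GRID and 0 <= nxt[1] < GRID
                    and nxt not in obstacles and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return [start]  # no route known yet; stay put

def drive(start, goal, sensor_frames):
    """Each tick: take in new sensor data, update the world model, re-plan, move."""
    obstacles = set()                 # the maintained model of the space around the car
    position = start
    for sensed in sensor_frames:      # real-time data arriving frame by frame
        obstacles |= sensed           # update the model
        route = plan_route(position, goal, obstacles)   # the 'plan' gets rebuilt
        if len(route) > 1:
            position = route[1]       # take one step along the current plan
    return position

# e.g. drive((0, 0), (5, 0), [{(2, 0)}, {(3, 0)}, set(), set(), set(), set(), set()])
```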

2

4art4 t1_j27c0vr wrote

Car navigation is a great example, and I will have to sit and think about that. It is more or less what I am getting at: the nav AI updates based on sensor inputs and plans a route accordingly. ChatGPT does not do this. You can ask it for a plan, and it will generate one. But it will never say to itself, "I'm bored. I think I'll try to start a chat with warren_stupidity," or "Maybe I can figure out why 42 is the answer to life, the universe and everything."

So... (just thinking here) maybe what I'm on about is a self-directed thought process. The car nav falls short because it only navigates to where we tell it to. ChatGPT falls short because it is not doing anything at all between answering questions.

1