
warren_stupidity t1_j26ks5l wrote

> It does not daydream, or plan, or anything else.

Well, that might or might not be true, especially the "or plan, or anything else" part, but it is also irrelevant unless you are asserting that these activities are essential properties of consciousness. If you are asserting that, how do you justify it?


4art4 t1_j26ojaj wrote

> unless you are asserting that these activities are essential properties of consciousness.

Yes. A "thinking" machine that does not plan is not "conscious" in my book. How can it be otherwise?

Not so much for dreaming; I included that to point out that when it is not responding to a prompt, it is not doing anything. It is not considering the universe or its place in it. It is not wishing upon a star. It is not hoping for world peace (or anything else). It is just unused code in that moment.


warren_stupidity t1_j26z0s8 wrote

Well, we will have to disagree about whether planning is essential for consciousness. But I disagree that AI cannot "plan". It is exactly what autonomous vehicles do: they process real-time data to update their navigation "plan" by building and maintaining a model of the space around them.
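
A rough sketch of that sense-update-replan cycle (a toy 1-D road, hypothetical names, not any real vehicle's planner):

```python
# Toy illustration: each tick, fold new sensor data into a model of the
# surrounding space, rebuild the route, and take only the first step.

def plan_route(pos: int, goal: int, blocked: set[int]) -> list[int]:
    """Plan a path along a 1-D road; stop short of any known obstacle."""
    route = []
    while pos != goal:
        step = pos + (1 if goal > pos else -1)
        if step in blocked:
            break                        # route ends at the obstacle; replan next cycle
        route.append(step)
        pos = step
    return route


def drive(start: int, goal: int, sensor_frames: list[set[int]]) -> int:
    pos = start
    for frame in sensor_frames:          # real-time sensor input, one frame per tick
        model = frame                    # current model of the space around the vehicle
        route = plan_route(pos, goal, model)   # the "plan" is rebuilt every cycle
        if route:
            pos = route[0]               # take one step, then sense and replan again
        if pos == goal:
            break
    return pos


# Example: an obstacle at cell 3 appears, then clears two ticks later.
print(drive(0, 5, [{3}, {3}, set(), set(), set(), set()]))  # -> 5
```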


4art4 t1_j27c0vr wrote

The car navigation is a great example, and I will have to sit and think about that. That is more or less what I am getting at: the nav AI updates based on sensor inputs and plans a route accordingly. ChatGPT does not do this. You can ask it for a plan, and it will generate one, but it will never say to itself "I'm bored. I think I'll start a chat with warren_stupidity" or "maybe I can figure out why 42 is the answer to life, the universe, and everything."

So... (just thinking here) maybe what I'm on about is a self-directed thought process. The car nav fails because it only navigates to where we tell it to; ChatGPT fails because it does nothing at all between answering questions.
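
Roughly the distinction I mean, in toy form (hypothetical code, nothing to do with how ChatGPT is actually built): a prompt-driven loop sits idle between requests, while a self-directed loop picks its own goals when nobody asks it anything.

```python
import queue
import random


def respond(prompt: str) -> str:
    return f"canned answer to: {prompt}"


def prompt_driven(prompts: queue.Queue) -> None:
    """Only ever runs in response to a prompt; between prompts it is just unused code."""
    while True:
        prompt = prompts.get()               # blocks here, doing nothing, until asked
        print(respond(prompt))


def self_directed(prompts: queue.Queue, ticks: int = 10) -> None:
    """Answers prompts when they arrive, but otherwise pursues goals it chose itself."""
    own_goals = ["start a chat with warren_stupidity",
                 "figure out why 42 is the answer"]
    for _ in range(ticks):
        try:
            prompt = prompts.get(timeout=0.1)
            print(respond(prompt))
        except queue.Empty:
            goal = random.choice(own_goals)  # activity nobody prompted
            print(f"thinking about: {goal}")


if __name__ == "__main__":
    q: queue.Queue = queue.Queue()
    q.put("what is the meaning of life?")
    self_directed(q)    # answers the one prompt, then keeps itself busy
```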
