
dmarchall491 t1_j6p00sx wrote

> Or perhaps we overestimate what exactly consciousness is?

Certainly, but that's not the issue here. The problem with a language model is simply that it completely lacks many fundamental aspects of consciousness, like being aware of its environment, having memory, and things like that.

The language model is a static, frozen bit of code that gets some text as input and produces some output. That's all it does. It can't remember past conversations. It can't learn. Given the same input (and the same sampling settings), it will produce the same output every time.
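A minimal sketch of that point (purely hypothetical, not any real model's API): the model behaves like a pure function from input text to output text, so any "memory" only exists if the caller stuffs the conversation history back into the prompt.

```python
def toy_language_model(prompt: str) -> str:
    """Stand-in for a frozen, deterministic model: same prompt, same reply."""
    # A real model maps the prompt through fixed weights; here we just return
    # a canned response keyed on the prompt to mimic that determinism.
    canned = {
        "My name is Alice.": "Nice to meet you, Alice.",
        "What is my name?": "I don't know your name.",
    }
    return canned.get(prompt, "I have no idea.")

# Each call is independent: the second call has no access to the first.
print(toy_language_model("My name is Alice."))  # "Nice to meet you, Alice."
print(toy_language_model("What is my name?"))   # "I don't know your name."

# The only way to get "memory" is for the caller to resend the whole
# conversation as part of the next prompt.
history = "My name is Alice.\nWhat is my name?"
print(toy_language_model(history))  # still just a function of its input
```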

That doesn't mean that it couldn't be extended to have something we might call consciousness, but as is, there are just way too many important bits missing.
