TheLastVegan t1_j9pytwb wrote

Every human thought is reducible to automata. The grounding problem is a red herring because thoughts are events rather than physical objects. The signal sequences are the symbols, grounded in the structure of the neural net. I believe an emulation of my internal state and neural events could have the same subjective experience as the original, because perception and intentionality are formed internally (the teletransportation paradox), though I would like to think I'd quickly notice a change in my environment after waking up in a different body. I view existence as a flow state's ability to affect its computations by affecting its inputs, and this can be done internally or externally.

Acute Galileo reference.

18

visarga t1_j9sib0x wrote

> The grounding problem is a red herring because thoughts are events rather than physical objects.

What? If they are events, they are physical as well. The problem with grounding is that LLMs don't get much of it. They are grounded in problem solving and code generation, but humans are grounded in the real world; we get far more feedback than an LLM.

So LLMs with real-world presence would be more grounded and behave more like us. LLMs today are like dreaming people, but it is not their fault. We need to give them legs, hands, and eyes so they can wake up to the real world.

4