
Surur t1_j55s75d wrote

I feel that symbolic thinking still needs to be solved, but maybe this is an emergent property.


croto8 t1_j56jl1g wrote

I think symbolic thinking may be inextricably linked to a sense of self. Giving an AI what we think of as understanding requires context and the perceiver's acknowledgement of the symbol within a larger setting, rather than just pattern recognition at scale.


EVJoe t1_j572xpw wrote

Consider synesthesia, the phenomenon in which a stimulus in one sensory channel (say, hearing) activates a perception in another (say, vision).

Imagine you have synesthesia, and you're a pre-linguistic human surviving in the wild. You hear a tiger roar, and via synesthesia you also "see" bright red; then a member of your tribe is eaten by the tiger while the others flee.

For such a person, "seeing" red now has personal symbolic meaning associated with tiger danger.

Symbolism does not need to derive from culture or a larger system. All you need is the capacity to recognize patterns across stimuli. What that looks like for a "mind" that isn't limited to human sensation is another question entirely.
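To make that concrete, here's a minimal sketch (pure Python, a hypothetical toy setup rather than any real model) of Hebbian-style co-occurrence counting between stimuli. A single episode in which the roar, the color red, and danger coincide is enough to forge a private red-danger link, with no culture or larger symbol system involved:

```python
from collections import Counter
from itertools import combinations

assoc = Counter()  # pairwise association strength between stimuli

def perceive(stimuli):
    """Strengthen the link between every pair of co-occurring stimuli."""
    for pair in combinations(sorted(stimuli), 2):
        assoc[pair] += 1

# Ordinary episodes: "red" and "danger" never co-occur.
for _ in range(5):
    perceive({"red", "sunset"})
    perceive({"roar", "danger"})

# The synesthetic episode: the roar evokes red, and danger follows.
perceive({"danger", "red", "roar"})

print(assoc[("danger", "red")])   # 1 -> red now carries danger meaning
print(assoc[("red", "sunset")])   # 5 -> though it still mostly means sunset
```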


croto8 t1_j573xp6 wrote

Your example uses personal experience to create the symbolic representation and subsequent association. Kind of my point.

Edit: to elaborate, pattern recognition could produce a similar outcome by training on data in which this symbolic pattern is inherent, but without the personal experience, sense of risk, and context, it's just gradient descent on whatever objective function was configured to emulate the process.
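For what it's worth, the mechanics are easy to show. Below is a minimal sketch (pure Python, hypothetical toy data): a single logistic unit trained by gradient descent to associate a "roar" feature with a "danger" label. The association falls straight out of co-occurrence in the data plus the configured objective (log loss); nowhere in the loop is there personal experience, a sense of risk, or context:

```python
import math, random

random.seed(0)

# Hypothetical toy dataset: (roar_present, danger) pairs with the
# symbolic pattern baked in by construction.
data = [(1.0, 1.0)] * 8 + [(0.0, 0.0)] * 8
w, b, lr = 0.0, 0.0, 0.5  # weight, bias, learning rate

for _ in range(200):
    random.shuffle(data)
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted P(danger)
        # Gradient of the log loss w.r.t. w and b; step downhill.
        w -= lr * (p - y) * x
        b -= lr * (p - y)

for x in (1.0, 0.0):
    p = 1.0 / (1.0 + math.exp(-(w * x + b)))
    print(f"roar={int(x)} -> P(danger)={p:.3f}")  # high with roar, low without
```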
