
alsuhr t1_j1zftk4 wrote

Language is an action we take to achieve some short- or long-term intent by affecting others' actions. It just so happens that text data is (mostly) symbolic, so it appears to be only a problem of symbol manipulation. The text these models are trained on consists of observations of language production, where utterances are generated from intent (e.g., wanting to convince someone of an argument, wanting to sell something to someone) and context (e.g., what you know about your interlocutor). This doesn't even cover vocal / signed communication, which is much more continuous.

Intent and context are not purely symbolic. Sure, with infinite observations, that generative structure would be perfectly reconstructible. But we are nowhere near that, and humans are completely capable of modeling that generative process with very little data and continuous input (which we learn to discretize).

10

maxToTheJ t1_j1zq0fa wrote

> Intent and context are not purely symbolic.

Yup. That's why reasoning comes in, and it's what makes what Demis from DeepMind said make sense.

2