
LoquaciousAntipodean OP t1_j5ch4pj wrote

👌🤩👆 This, 100% this; you have hit the nail right bang on the head here! Language and intelligence are not quite the same thing, but the relationship between them is like the one between 'fuel' and 'fire', as I see it. Language is the fuel, evolution is the oxygen, survival selection is the heat, and intelligence is the fire that emerges from the continuous interaction of the first three. And, like fire, intelligence is what gives a reason to gather more of the first three ingredients, to keep the fire going.

Language is ambiguous, to greater and lesser degrees: English is highly ambiguous, deliberately so, to enable poetic expression, while mathematics strives structurally to eliminate ambiguity as much as possible (though there are still some tough nuts, like √2, e, π and i, that is, √-1, that defy easy comprehension). But intelligence is also ambiguous!

This was my whole point with the supercilious ranting about Descartes in my OP. This solipsistic, mechanistic 'magical thinking' about intelligence, which fundamentally derives from the meaningless tautology of 'I think, therefore I am', is a complete philosophical dead end, and it will only cause AI developers more frustration if they stick with it, in my opinion.

They are, if you will, putting Descartes before Des Horses: obsessing over the mysteries of 'internal intelligence' inside the brain, and entirely forgetting about the mountains and mountains of socially generated, culturally encoded stories and lessons that live all around us, outside of our brains. Call it our 'external intelligence', or 'extelligence', if you like.

That 'extelligence' is what AI is actually modelling itself on, not our 'internal intelligence'. That is why LLMs seem to have all these enigmatic 'emergent properties', I think.
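To make that point concrete, here is a minimal sketch: a toy bigram model in Python, nothing like a real transformer, with a made-up three-sentence 'corpus' I invented for illustration. It shows that the only training signal such a model ever receives is externally recorded text, and everything it generates is a statistical recombination of that 'extelligence':

```python
from collections import Counter, defaultdict
import random

# Toy, made-up 'extelligence': externally recorded text is the ONLY
# training signal; no one's inner mental state ever enters the model.
corpus = (
    "stories live outside our brains . "
    "lessons live outside our brains . "
    "stories are lessons ."
).split()

# Count bigram transitions, i.e. how often each word follows another.
transitions = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current][nxt] += 1

def sample_next(word):
    """Pick the next word in proportion to how often it followed
    `word` in the recorded corpus: pure statistics of external text."""
    counts = transitions[word]
    return random.choices(list(counts), weights=counts.values())[0]

# Everything the model 'says' is recombined corpus, nothing more.
words = ["stories"]
for _ in range(6):
    words.append(sample_next(words[-1]))
print(" ".join(words))
```

A real LLM swaps the bigram table for a transformer and the three sentences for trillions of words, but the in-principle point stands: the model is fitted to the external record, not to any brain.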
