Submitted by xutw21 t3_ybzh5j in singularity
billbot77 t1_itkvil1 wrote
Reply to comment by gibs in Large Language Models Can Self-Improve by xutw21
On the other hand, language is at the foundation of how we think.
gibs t1_itkwtf8 wrote
So people who lack language cannot think?
blueSGL t1_itlmagr wrote
Thinking about [thing] necessitates being able to form a representation/abstraction of [thing]; language is a formalization of that abstraction which allows for communication. It's perfectly possible to think without a language attached, but more than likely having a language makes thinking easier.
GeneralZain t1_itl04mo wrote
who lacks language?
Haile_Selassie- t1_itlyxk9 wrote
Read about feral children
billbot77 t1_itmvzxs wrote
This is exactly what I meant. Feral kids lacking in language had limited ability to think and reason in abstracted terms. Conversely, kids raised bilingual have higher cognitive skills.
Also, pattern recognition is the basis of intelligence.
Whether "sentience" is an emergent property is a matter for the philosophers - but starting with Descartes ("I think, therefore I am") as the basis of identity doesn't necessarily require any additional magic sauce for consciousness.
BinyaminDelta t1_itoh4ta wrote
Allegedly many people do not have an inner monologue.
I say allegedly because I can't fathom this, but it's apparently true.
gibs t1_itpnbia wrote
I don't have one. I can't fathom what it would be like to have a constant narration of your life inside your own head. What a trip LOL.
kaityl3 t1_itsym7e wrote
It would be horrible to have it going constantly. I narrate to myself when I'm essentially "idle", but if I'm actually trying to do something or focus, it shuts off thankfully.
gibs t1_itnozbf wrote
People with aphasia / damaged language centres. Of course that doesn't preclude the possibility of there being some foundational language of thought that doesn't rely on the known structures used for (spoken/written) language. But we haven't unearthed evidence of such a thing in the history of scientific enquiry, and the chance of this being the case seems vanishingly small.
ExpendableAnomaly t1_itldwg3 wrote
No, but it gives us a higher level of thought
kaityl3 t1_itsyvfr wrote
Yeah, I truly believe that the fact that these models can parse and respond in human language is badly downplayed. It takes so much intelligence and complexity under the surface to understand language at all. But I guess that because we (partially) know how these models decide what to say, everyone simplifies it as some basic probabilistic process... even though, for all we know, we humans are doing a biological version of the exact same thing when we decide what to say.
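The "basic probabilistic process" being dismissed here can be sketched in a few lines. This is a toy illustration only - real language models learn a distribution over tens of thousands of tokens from data, while the probability table below is invented by hand - but the core move, sampling the next word in proportion to its probability, looks like this:

```python
import random

# Hypothetical hand-made probability table standing in for a learned model.
NEXT_WORD_PROBS = {
    "the": {"cat": 0.5, "dog": 0.3, "idea": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"barked": 0.7, "sat": 0.3},
}

def sample_next(word, rng=random.random):
    """Pick the next word by sampling from its probability distribution."""
    dist = NEXT_WORD_PROBS[word]
    r = rng()
    cumulative = 0.0
    for candidate, p in dist.items():
        cumulative += p
        if r < cumulative:
            return candidate
    return candidate  # guard against floating-point rounding at the tail

# Passing a fixed rng makes the sampling deterministic for demonstration:
print(sample_next("the", rng=lambda: 0.0))  # falls in "cat"'s 0.5 mass
print(sample_next("the", rng=lambda: 0.6))  # falls in "dog"'s 0.3 mass
```

Whether doing this at enormous scale amounts to "understanding" is exactly the disagreement in this thread; the sketch only shows that the mechanism itself is simple to state.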