Viewing a single comment thread. View all comments

harharveryfunny t1_jca7x9f wrote

Yes - the Transformer is proof by demonstration that you don't need a language-specific architecture to learn language, and also that you can learn language via prediction feedback, which is very likely how our brain does it too.

Chomsky is still sticking to his innateness opinion, though (with Gary Marcus cheering him on). Perhaps Chomsky will now claim that Broca's area is a Transformer?

4

Alimbiquated t1_jcbspbs wrote

This kind of model needs vastly more input data than the human brain does to learn. It doesn't make sense to compare the two.

For example, ChatGPT was trained on 570 GB of data comprising 300 billion words.

https://analyticsindiamag.com/behind-chatgpts-wisdom-300-bn-words-570-gb-data/

If a baby heard one word a second, it would take nearly 10,000 years to hear as many words as ChatGPT trained on. But babies only need a few years, and they hear words at a much lower average rate.
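
Rough arithmetic, for anyone who wants to check (the 300-billion-word figure is from the article above; the one-word-per-second listening rate is just an assumption for the sake of argument):

```python
# Back-of-envelope check of the "nearly 10,000 years" figure.
# 300 billion words comes from the article linked above; the
# one-word-per-second listening rate is just an assumption.
TRAINING_WORDS = 300_000_000_000
WORDS_PER_SECOND = 1

years = TRAINING_WORDS / WORDS_PER_SECOND / (60 * 60 * 24 * 365.25)
print(f"{years:,.0f} years")   # ~9,506 years
```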

So these models don't undermine the claim of innateness at all.

7

harharveryfunny t1_jcchnkp wrote

That's a bogus comparison, for a number of reasons such as:

  1. These models are learning vastly more than language alone

  2. These models are learning in an extraordinarily difficult way, with *only* "predict next word" feedback and nothing else (sketched after this list)

  3. Humans learn in a much more efficient, targeted way, via curiosity-driven knowledge-gap filling

  4. Humans learn via all sorts of modalities in addition to language. Having already learnt a concept, we only need to be given a name for it once for it to stick
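
To make point 2 concrete, here's a minimal sketch of what "predict next word" feedback looks like as a training signal (toy vocabulary and toy model in PyTorch, not any particular LLM):

```python
# Minimal sketch of "predict next word" feedback as the sole training
# signal (toy vocabulary and model, not any particular LLM; a real model
# would attend over the whole prefix rather than just the current token).
import torch
import torch.nn as nn

vocab_size, embed_dim = 100, 32
model = nn.Sequential(
    nn.Embedding(vocab_size, embed_dim),
    nn.Linear(embed_dim, vocab_size),   # a score for every candidate next token
)

tokens = torch.randint(0, vocab_size, (1, 16))   # stand-in token ids
logits = model(tokens[:, :-1])                   # predictions at each position
loss = nn.functional.cross_entropy(              # compare against the actual next token
    logits.reshape(-1, vocab_size), tokens[:, 1:].reshape(-1)
)
loss.backward()   # the entire learning signal is this one scalar
```

The only thing the model ever gets back is that one scalar loss: no labels, no grounding, no correction beyond how far off its next-token guess was.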

6

Necessary-Meringue-1 t1_jcm5mye wrote

>These models are learning vastly more than language alone

A child growing up does too.

>These models are learning in an extraordinarily difficult way with *only* "predict next word" feedback and nothing else

That's literally the point: LLMs do not learn language like humans at all. Unless you're trying to say that you and I are pure Skinner-type behaviorist learners.

1

Alimbiquated t1_jcd2z4g wrote

I agree that comparing these learning processes to brains is bogus.

There is a general tendency to assume that if something seems intelligent, it must be like a human brain. It's like assuming that because it's fast, a car must have legs like a horse and eat oats.

0

Necessary-Meringue-1 t1_jcm6j79 wrote

>There is a general tendency to assume that if something seems intelligent, it must be like a human brain. It's like assuming that because it's fast, a car must have legs like a horse and eat oats.

Ironic, because that is literally what that article is doing.

1

Alimbiquated t1_jcmi1fd wrote

Right, it makes no sense.

1

Necessary-Meringue-1 t1_jcmjqhm wrote

I don't understand why it's so hard for people to acknowledge that LLMs deliver extremely impressive results, but that this does not mean they have human-like intelligence or language understanding.

1

currentscurrents t1_jcdsf9u wrote

The brain doesn't have any built-in knowledge about language, but it has an advantage: it's trying to communicate with other brains.

It is fundamentally impossible to understand human language without understanding how humans think. Language isn't a structured formal thing; it's more like the fuzzy interactions of two neural networks.

Humans already know how other humans think - plus they have a shared world environment to ground the symbols in. LLMs have to learn to approximate both of those.

2

sam__izdat t1_jcet79g wrote

> Language isn't a structured formal thing

[citation needed]

2

currentscurrents t1_jcfu9l8 wrote

That's why it's a natural language instead of a formal language.

2

WikiSummarizerBot t1_jcfub6d wrote

Natural language

>In neuropsychology, linguistics, and philosophy of language, a natural language or ordinary language is any language that has evolved naturally in humans through use and repetition without conscious planning or premeditation. Natural languages can take different forms, such as speech or signing. They are distinguished from constructed and formal languages such as those used to program computers or to study logic.

Formal language

>In logic, mathematics, computer science, and linguistics, a formal language consists of words whose letters are taken from an alphabet and are well-formed according to a specific set of rules. The alphabet of a formal language consists of symbols, letters, or tokens that concatenate into strings of the language. Each string concatenated from symbols of this alphabet is called a word, and the words that belong to a particular formal language are sometimes called well-formed words or well-formed formulas. A formal language is often defined by means of a formal grammar such as a regular grammar or context-free grammar, which consists of its formation rules.


1

sam__izdat t1_jch1c32 wrote

I'm familiar with the terms, but saying e.g. "imaginary numbers don't exist because they're called imaginary" is not making a meaningful statement. All you've said is that German is not C++, and we have a funny name for that. And that's definitely one of the fuzzier interactions you can have about this, but I'm not sure how it proves that natural languages (apparently? if I'm reading this right...) lack structure.

1

currentscurrents t1_jch3nic wrote

So why do you think it is a structured formal thing?

1

sam__izdat t1_jch4kn0 wrote

It is a "structured thing" because it has concrete definable grammatical rules, shared across essentially every language and dialect, and common features, like an infinite range of expression and recursion. If language didn't have syntactic structure we'd just be yelling signals at each other, instead of doing what we're doing now. There would be nothing for GPT to capture.

1

currentscurrents t1_jch9ulc wrote

Oh, it is clearly structured. Words and phrases and sentences are all forms of structure and we're using them right now.

What it doesn't have is formal structure; it cannot be fully defined by any set of rules. This is why you can't build a rules-based parser that understands English and have to use an 800GB language model instead.
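
Here's a toy sketch of what I mean by a rules-based parser (a hypothetical five-rule grammar): it accepts exactly the sentences its rules anticipate and rejects everything else, which is why hand-writing enough rules to cover real English never panned out.

```python
# Toy "rules-based parser": a hypothetical five-rule fragment of English.
# It accepts exactly the sentences its rules anticipate and rejects the rest.
RULES = {
    "S":  [("NP", "VP")],
    "NP": [("the", "N"), ("a", "N")],
    "VP": [("V", "NP")],
    "N":  [("dog",), ("cat",)],
    "V":  [("sees",), ("chases",)],
}

def parses(symbol, words):
    """True if the word list can be derived from the given symbol."""
    if symbol not in RULES:                        # terminal word
        return len(words) == 1 and words[0] == symbol
    return any(matches(rule, words) for rule in RULES[symbol])

def matches(rule, words):
    if not rule:
        return not words
    head, *rest = rule
    # try every split point: head covers the first i words, rest the remainder
    return any(
        parses(head, words[:i]) and matches(tuple(rest), words[i:])
        for i in range(1, len(words) - len(rest) + 1)
    )

print(parses("S", "the dog sees a cat".split()))        # True
print(parses("S", "dogs generally like cats".split()))  # False: no rule covers it
```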

>shared across essentially every language and dialect

Noam Chomsky thinks this, but the idea of a universal grammar is controversial in modern linguistics.

1

sam__izdat t1_jchg8nd wrote

I'll leave it to the linguists to debate UG and the specifics of what it does and doesn't mean, but commonalities like some sort of hierarchy, recursion, structure-dependence of rules, etc. clearly exist, whatever you want to call them. By "shared" I just mean there are specific things that human cognitive faculties are set up to do, and other (often computationally simpler) things they clearly don't do. But again, if you're just saying natural languages are not formal languages, I guess that's true by definition. It just sounded to me like you were implying something different.

1

Necessary-Meringue-1 t1_jcm5x7g wrote

Just because it's "natural" does not mean it's unstructured or has no logic. Can you be any more disingenuous than to rely on etymology-based semantics?

Like programmers invented structure

0

Necessary-Meringue-1 t1_jcm4o9d wrote

>the Transformer is proof by demonstration that you don't need a language-specific architecture to learn language, and also that you can learn language via prediction feedback, which it highly likely how our brain does it too.

Where to even start? How about this:

The fact that a transformer can appear to learn language on a non-specific architecture does not at all mean that humans work the same way.


Did you ingest billions of tokens of English growing up? How did you manage to reach decent proficiency by the age of 6? Did you read the entire Common Crawl corpus by age 10?


This kind of argument is on paper stilts. LLMs are extremely impressive, but that does not mean they tell you much about how humans do language.

1