jcinterrante t1_ix8wpra wrote
I feel like Fortran should really be at the top of this, because it's the common ancestor of so many of these languages.
jcinterrante t1_iqx6eii wrote
Reply to comment by e3928a3bc in The returns to learning the most common words, by language [OC] by orgtre
This is probably why Hebrew scores so poorly on this metric. Many of these kinds of articles and conjunctions are attached to the word as a prefix character. Like:
אבא - father
האבא - the father
But it’s not as if it’s any harder to spot the Hebrew prefix ה than it is to spot the English “the” just because it’s a prefix rather than its own word…
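To make the counting point concrete, here is a minimal sketch (my own toy sentences, not data from the linked post): with naive whitespace tokenization, English surfaces “the” as a single very frequent token, while the Hebrew article never appears on its own, so prefixed forms like האבא and והאבא count as separate word types even though both contain אבא.

```python
from collections import Counter

# English: the article is its own token, so it dominates the frequency count.
english = "the father saw the mother and the father smiled"

# Hebrew: the article ה ("the") and conjunction ו ("and") are attached as prefixes,
# so each prefixed combination becomes a distinct whitespace-delimited "word".
hebrew = "האבא ראה את האמא והאבא חייך"

print(Counter(english.split()))
# e.g. {'the': 3, 'father': 2, ...} -- "the" is one high-frequency token

print(Counter(hebrew.split()))
# האבא and והאבא are counted as different word types, spreading Hebrew's
# frequency mass across more distinct forms than the English equivalent
```

That spread over more distinct surface forms is what drags Hebrew down on a “most common words” metric, even though learning to spot the prefix itself is easy.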
jcinterrante t1_j7guik1 wrote
Reply to Does the high dimensionality of AI systems that model the real world tell us something about the abstract space of ideas? [D] by Frumpagumpus
Check out the UChicago Knowledge Lab. This sounds generally related to what James Evans is working on. His work is more narrowly targeted than what you’re talking about because it’s focused on the generation of ideas in academic settings. But it’s still a good starting place for you.
It also sounds like it could be related to some of the work coming out of the Santa Fe Institute. But I don’t have any specific papers in mind.