naum547 t1_je7ml6j wrote

LLMs are trained almost exclusively on text, so they excel at language: they have an amazing model of human languages and know how to use them. What they lack is, for example, a model of the Earth, so they fail at tasks involving latitude and longitude. Same for math: the only reason they "know" that 2 + 2 = 4 is that they read it enough times during training; they have no underlying concept of quantity. If they were trained on something like 3D objects, they might actually understand that 2 things plus 2 things make 4 things.