Thebadwolf47 t1_iwwn0g3 wrote
Reply to comment by PhelesDragon in Meta has withdrawn its Galactica AI, only 3 days after its release, following intense criticism. Meta’s misstep—and its hubris—show once again that Big Tech has a blind spot about the severe limitations of large language models in AI. by lughnasadh
Intuition and inherited memory are training data, passed down from all your ancestors through their DNA, which dictates the basic formation of your brain. It's just like how some animals can walk or eat right after being born: it's not that they haven't had training data, it's that this training data has been encoded in their DNA.
Thebadwolf47 t1_jdnbfya wrote
Reply to comment by Short_Change in [D] Do we really need 100B+ parameters in a large language model? by Vegetable-Skill-9700
Wasn't he comparing the parameters to the physical volume of the first computers, rather than to their transistor count?