harrier_gr7_ftw t1_j37iwr5 wrote

He went FR after the first paragraph, but the algorithms are well known... it just gets expensive because you need to buy data to train on and, like you say, the compute time.

Everyone is/was surprised that transformers give better results the more data you feed them, but that is literally OpenAI's raison d'être: make the next, better GPT by feeding in more data. Sadly most of us can't afford a 1000TB RAID setup to store the Common Crawl and piles of scanned books on, plus a load of Nvidia A100 GPUs. :-(
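The "more data, better results" trend is the scaling-law observation: loss falls roughly as a power law in dataset size. A toy sketch, using constants close to the ones reported in the Kaplan et al. (2020) scaling-laws paper (treat them as illustrative, not gospel):

```python
# Toy data scaling law: L(D) = (Dc / D) ** alpha, where D is the
# number of training tokens. Dc and alpha below are roughly the
# data-scaling constants from Kaplan et al. 2020 -- illustrative only.

def loss(tokens: float, d_c: float = 5.4e13, alpha: float = 0.095) -> float:
    """Predicted loss for a model trained on `tokens` training tokens."""
    return (d_c / tokens) ** alpha

for d in (1e9, 1e10, 1e11, 1e12):
    print(f"{d:.0e} tokens -> predicted loss {loss(d):.3f}")
```

Each 10x increase in data buys a roughly constant multiplicative drop in loss, which is why "just feed it more data" keeps working — and why the storage and compute bills keep growing.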

AGI is another thing entirely, of course, and will need a lot more research.