ShowerGrapes t1_jeaaln0 wrote

>we are just getting more and more efficient by the day, meaning anyone will be able to run GPT-n perf on their hardware soon.

yes, anyone will be able to train neural networks, but not the kind that makes simps like musk tremble with fear. OpenAI spent 7 million on cloud computing costs alone to train gpt. it would be a trivial (and misguided) task to shut down future ai development.


Sure_Cicada_4459 t1_jeace5e wrote

Actually no. And this is still an underestimate, because predicting 10 years of algorithmic advances in the field of AI is silly. That doesn't even account for distillation, more publicly available datasets and models, multi-LLM systems,... There are so many dimensions in which this train is running that it makes you dizzy thinking about it, and it makes regulation look like nothing more than pure cope.


ShowerGrapes t1_jeakin8 wrote

it's a suggested pause of 6 months, not 7 years. plus, i agree the pause is a dumb idea.