Submitted by rretaemer1 t3_10yz6uq in Futurology
r2k-in-the-vortex t1_j80pk9o wrote
If you are thinking of large language models like ChatGPT, then sorry, that's not going to happen in open source any time soon. Not only is the training cost prohibitive, but consumer hardware is nowhere near able to run those models. They are just plain too large.
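To put "too large" in rough numbers, here is a back-of-the-envelope sketch. The 175B parameter count is GPT-3's published size; the 24 GB consumer GPU figure is my assumption for a typical high-end card, and this counts only the weights, not activations or KV cache:

```python
# Rough VRAM estimate for serving a GPT-3-scale model in half precision.
params = 175e9                # 175 billion parameters (GPT-3 scale)
bytes_per_param_fp16 = 2      # 16-bit floats: 2 bytes per parameter
weights_gb = params * bytes_per_param_fp16 / 1e9

consumer_gpu_gb = 24          # assumed top-end consumer GPU (e.g. RTX 4090 class)
gpus_needed = weights_gb / consumer_gpu_gb

print(f"Weights alone: {weights_gb:.0f} GB")           # ~350 GB
print(f"Consumer GPUs just to hold them: {gpus_needed:.1f}")
```

Even ignoring activations and inference overhead, the weights alone would need well over a dozen of the largest consumer GPUs.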
Be happy that Stable Diffusion was released for free. Training it cost around $600k, by the way.