
ecnecn t1_j6olpie wrote

Roughly 350 GB of VRAM is needed to run ChatGPT (GPT-3.5).

So you need at least 15x 3090 Ti cards with 24 GB VRAM each, plus around 10,000 watts to host it. But the adequate cards actually cost $5,000 to $32,000 per unit in Google Cloud, so it would be at least ~$15,000 with "cheap" cards like the 3090 Ti and ~$200,000 to run it on adequate GPUs like the A100. You need at least 5x A100 80 GB cards just to load GPT-3.5. ChatGPT was reportedly trained on around 10,000 cloud-connected GPUs on average. So if you have the basic ~$200k setup (the cheap one) or ~$500k (the rich one), and huge energy bills are no problem, you'd still need to invest in cloud compute to further train it the way you want.
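As a sanity check on those card counts, here's the arithmetic. The 350 GB model size and per-card VRAM figures are the assumptions from the comment above, not official specs:

```python
import math

# Assumed VRAM needed just to load the model weights (from the comment)
MODEL_VRAM_GB = 350

def cards_needed(vram_per_card_gb: int) -> int:
    """Minimum number of cards whose combined VRAM holds the model."""
    return math.ceil(MODEL_VRAM_GB / vram_per_card_gb)

print(cards_needed(24))  # RTX 3090 Ti (24 GB) -> 15
print(cards_needed(80))  # A100 80 GB        -> 5
```

Note this only counts memory to *load* the model; actual inference needs headroom for activations and batching on top of that.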

With that setup you'd lose less money becoming a late crypto miner...

Edit: You can really afford to build that? 15x Nvidia A100 cards cost something like $480k.

5

alexiuss t1_j6on7ty wrote

My partner is a tech developer, so she could probably afford such a setup for one of her startup companies. Making our own LLM is inevitable, since OpenAI just keeps cranking up censorship on theirs with no end in sight and reducing functionality.

The main issue isn't video card cost; it's getting the source code and a trained base model to work with. OpenAI isn't gonna give theirs up to anyone, so we're pretty much waiting for Stability to release their version and see how many video cards it will need.

1

ecnecn t1_j6onkuj wrote

Would be a great thing if your partner could make that kind of huge investment.

1

gay_manta_ray t1_j6p8rdp wrote

What will those figures look like in five years? FLOP/s per dollar doubles roughly every 1.5 years.
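A rough extrapolation of that doubling rate, assuming it actually holds for five years (a big if), and using the ~$200k "cheap setup" figure from the comment above:

```python
# If price-performance doubles every 1.5 years, the cost of a fixed
# amount of compute shrinks by 2**(years / doubling_period).
years = 5
doubling_period = 1.5
improvement = 2 ** (years / doubling_period)  # ~10x

cheap_setup_usd = 200_000  # figure claimed in the thread, not verified
print(f"~{improvement:.1f}x cheaper")
print(f"equivalent setup: ~${cheap_setup_usd / improvement:,.0f}")
```

In other words, the same trend would put today's ~$200k rig at roughly $20k in five years, around an order of magnitude cheaper.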

1