
CeFurkan t1_j2jri17 wrote

Make no mistake: that RAM would not be beneficial for you. Almost all AI algorithms run on the GPU; otherwise they are just too slow.

For example, an RTX 3060 is 22 times faster than my Core i7-10700F.

How Good is RTX 3060 for ML AI Deep Learning Tasks and Comparison With GTX 1050 Ti and i7 10700F CPU

16

iNstein t1_j2l04ge wrote

Maybe OP should look at multiple 4080s running together, something like the rigs used for crypto mining. I know you can get special motherboards designed to let you connect multiple cards; you just need a honking great power supply.

3

MrEloi OP t1_j2wlzzz wrote

Good idea.

I have checked the pricing of 12 GB and 24 GB GPUs ... not too bad currently.

I suspect that, say, 5 x 24 GB GPUs would come to under $10k.

Add in the special motherboard and maybe we are looking at around $15k for a complete hardware system.

With careful purchasing that could possibly come down to under $10k total.
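The arithmetic above can be sketched out quickly. All prices here are hypothetical placeholders, not real quotes:

```python
# Back-of-envelope cost estimate for a multi-GPU build.
# GPU_PRICE and MOBO_PSU_ETC are assumed figures, not actual quotes.
GPU_PRICE = 1800       # assumed price of one 24 GB GPU, USD
NUM_GPUS = 5
MOBO_PSU_ETC = 2500    # assumed mining-style motherboard, PSU, case, CPU, RAM

gpu_total = GPU_PRICE * NUM_GPUS
system_total = gpu_total + MOBO_PSU_ETC

print(gpu_total)      # 9000  -> under the $10k GPU budget
print(system_total)   # 11500 -> between the $10k and $15k estimates
```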

And, yes, GPT-style models can be run across multiple GPUs, both for training and for inference.

1
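The idea behind running one model across several cards can be sketched in plain Python (no real GPU library involved): the model's layers are split into contiguous slices, one slice per device, so the combined VRAM only has to hold the model once. The layer sizes and device count below are made-up numbers for illustration:

```python
# Toy illustration of layer-wise model parallelism: greedily pack a
# model's layers onto devices in contiguous chunks, moving to the next
# device once the current one holds roughly its fair share.
def shard_layers(layer_sizes_gb, num_devices):
    """Return a list of layer-index lists, one per device."""
    target = sum(layer_sizes_gb) / num_devices
    shards = [[] for _ in range(num_devices)]
    device, used = 0, 0.0
    for i, size in enumerate(layer_sizes_gb):
        # Spill to the next device when this one is full (but never
        # leave a device empty, and never run past the last device).
        if used + size > target and shards[device] and device < num_devices - 1:
            device += 1
            used = 0.0
        shards[device].append(i)
        used += size
    return shards

# e.g. twelve 2 GB layers spread across 4 hypothetical GPUs:
print(shard_layers([2.0] * 12, 4))
# [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9, 10, 11]]
```

Real frameworks do essentially this (plus moving activations between cards at the slice boundaries), which is why a model too big for any single card can still run on a multi-GPU rig.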