Submitted by Numerous_Talk7940 t3_11w9hkj in deeplearning
Hi there,
I want to upgrade my GPU since I'm getting more involved in deep learning and train models every day. The two choices for me are the 4080 and 4090, and I wonder how noticeable the differences between the two cards actually are. That is, will training be 2x faster or just 1.2x? What is the actual benefit of investing more money, if my budget is not capped?
wally1002 t1_jcx6232 wrote
For deep learning, higher VRAM is always preferable. 12/16GB limits the kinds of models you can run/infer. With LLMs getting democratised, it's better to be future-proof.
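To make the VRAM point concrete, here's a rough back-of-the-envelope sketch (my own illustration, not from the thread) of how much memory just the *weights* of a model take. Note this ignores activations, gradients, optimizer state, and KV cache, so training needs several times more:

```python
def weight_vram_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Memory for model weights alone, in GiB.

    bytes_per_param = 2 assumes fp16/bf16; use 4 for fp32, 1 for int8.
    """
    return n_params * bytes_per_param / 1024**3

# A hypothetical 7-billion-parameter LLM loaded in fp16:
print(round(weight_vram_gib(7e9), 1))  # ~13.0 GiB for the weights alone
```

So a 7B model in fp16 already eats ~13 GiB before any activations or KV cache, which is why the 4090's 24GB buys meaningfully more headroom than the 4080's 16GB for LLM work.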