Submitted by Numerous_Talk7940 t3_11w9hkj in deeplearning

Hi there,

I want to upgrade my GPU since I'm getting more involved in deep learning and training models every day. The two choices for me are the 4080 and the 4090, and I wonder how noticeable the difference between the two cards actually is. That is, will training be 2x faster or just 1.2x? What is the actual benefit of investing more money, if my budget is not capped?

16

Comments

wally1002 t1_jcx6232 wrote

For deep learning, higher VRAM is always preferable. 12/16 GB limits the kinds of models you can run or do inference with. With LLMs getting democratised, it's better to be future-proof.
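As a rough illustration of why VRAM becomes the bottleneck: for full training, the weights, gradients, and optimizer states all have to fit on the card before activations are even counted. A minimal back-of-the-envelope sketch in Python (the 1.3B-parameter figure is just an example, not a specific model):

```python
def rough_training_vram_gb(num_params: int, bytes_per_param: int = 4) -> float:
    """Very rough lower bound for full training with Adam:
    weights + gradients + optimizer moments.
    Ignores activations, which often dominate and grow with batch size.
    """
    weights = num_params * bytes_per_param
    grads = num_params * bytes_per_param
    adam_moments = 2 * num_params * 4  # Adam keeps two fp32 moments per parameter
    return (weights + grads + adam_moments) / 1024**3

# e.g. a hypothetical 1.3B-parameter model in fp32: ~19 GB before activations,
# which already rules out a 16 GB card for full training.
print(f"{rough_training_vram_gb(1_300_000_000):.1f} GB")
```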

12

RichardBJ1 t1_jcxcu7e wrote

Probably need an answer from someone who has both and has benchmarked some examples (EDIT: and I do not!). Personally I find a lot of law-of-diminishing-returns with this type of thing. Also, for me, since I spend 100x more time coding and testing with dummy sets… the actual speed of a run is not as critical as people would expect…

1

mrcet007 t1_jcxgyl7 wrote

12/16 GB is already hitting the limit of what's available on the market for consumer gaming GPUs. The only GPU for deep learning with more than 16 GB is the 4090, which is already out of range for most individuals at $1500.

−4

MisterManuscript t1_jcxo0zv wrote

You don't need the 40 series; it's designed around providing ML features to games. You're paying extra just for a built-in optical-flow accelerator that you're not going to use for model training.

The optical-flow accelerator is meant for computing dense optical-flow fields as one of many inputs to the DLSS feature that most new games use.

You're better off with the 30 series or lower.

17

chatterbox272 t1_jcy2h7v wrote

The speed is not the main difference you're going to notice; it's the VRAM. VRAM is a hard limit that's difficult to work around, so it simply comes down to whether you need the extra.
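Since VRAM is the hard limit, a quick way to see your ceiling and current headroom is to query the device. A minimal PyTorch sketch, assuming CUDA is available and device index 0:

```python
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    allocated_gb = torch.cuda.memory_allocated(0) / 1024**3   # tensors currently held
    reserved_gb = torch.cuda.memory_reserved(0) / 1024**3     # cached by the allocator
    print(f"{props.name}: {total_gb:.1f} GB total, "
          f"{allocated_gb:.2f} GB allocated, {reserved_gb:.2f} GB reserved")
```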

3

Numerous_Talk7940 OP t1_jd6ksav wrote

I am not confident comparing these charts, but it looks like the 4090 is superior to the 3090 (the 3090 because it has the same VRAM, which is an important factor, as I have learned). By how much, though? Does the price justify the advantages of the 4090?
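If you want a number for your own workload rather than from published charts, a crude option is to time the same training step on each card. A minimal sketch, assuming PyTorch and a CUDA device; the layer sizes, batch size, and step count are arbitrary placeholders, not a real benchmark suite:

```python
import time
import torch

def seconds_per_step(steps: int = 50, batch: int = 256) -> float:
    """Time a crude mixed-precision training loop on the current GPU."""
    device = torch.device("cuda")
    model = torch.nn.Sequential(
        torch.nn.Linear(4096, 4096), torch.nn.ReLU(),
        torch.nn.Linear(4096, 4096), torch.nn.ReLU(),
        torch.nn.Linear(4096, 1000),
    ).to(device)
    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
    scaler = torch.cuda.amp.GradScaler()
    x = torch.randn(batch, 4096, device=device)
    y = torch.randint(0, 1000, (batch,), device=device)

    def step():
        opt.zero_grad(set_to_none=True)
        with torch.cuda.amp.autocast():
            loss = torch.nn.functional.cross_entropy(model(x), y)
        scaler.scale(loss).backward()
        scaler.step(opt)
        scaler.update()

    for _ in range(10):  # warmup so kernel launches/autotuning don't skew the timing
        step()
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(steps):
        step()
    torch.cuda.synchronize()
    return (time.perf_counter() - start) / steps

print(f"{seconds_per_step() * 1000:.1f} ms/step")
```

Running the same script on a 3090 and a 4090 (or on a cloud instance of either) gives a ratio that reflects your actual workload better than gaming benchmarks do.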

0

GrandDemand t1_jd96065 wrote

No. I'd get a used 3090. Save the rest of your money for when you have more experience and a better grasp of the kinds of problems you'd like to solve and the hardware needed to run the corresponding models. Then you'll realize either that your hardware is sufficient as-is (with a 3090), that a 4090 would actually benefit you, that a card with 48GB of VRAM is essential (i.e. you need either an Ada Titan, if it comes out, or 2x 3090s), or that it's way too expensive to run on consumer hardware and you should just use cloud GPU instances with A100s or H100s instead. But the 3090 will be a great card for now, and a used one in great condition (sometimes even open-box or with an active warranty) can easily be found on the hardwareswap subreddit for $800 or even less.

0

GrandDemand t1_jd9w0n4 wrote

Yes, that's likely better actually, since it's a much newer card (less likely to have been mined on). Just make sure you look into tuning it for efficiency, as the card is designed to run at 450W (which is insane). I can't point you to any guides for the 3090Ti specifically, but I'd just google "3090Ti undervolt". The 3090 should probably be undervolted too; you really don't need these cards to be hitting their power limits of 450W and 350W respectively, and tuning them down to a more reasonable 325-350W and 280-300W makes way more sense.
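For the power-limit side of that tuning, the usual scriptable route is nvidia-smi (a cap on board power, not a full undervolt curve, which typically needs a tool like MSI Afterburner on Windows). A minimal sketch wrapping it from Python; the 300 W figure is just an example target, and setting the limit needs admin/root:

```python
import subprocess

def gpu_power_status() -> str:
    """Query current draw and enforced limit via nvidia-smi (read-only)."""
    return subprocess.run(
        ["nvidia-smi", "--query-gpu=name,power.draw,power.limit",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

def set_power_limit(watts: int) -> None:
    """Cap the board power limit; requires admin/root and a supported driver."""
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

print(gpu_power_status())
# set_power_limit(300)  # e.g. cap a 3090Ti at ~300 W instead of its stock 450 W
```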

0

GrandDemand t1_jdbi5kd wrote

I'd say in that case just go with a new 4090, unless you can get a used 3090 for half that or a used 3090Ti for a little over half. I'm surprised that the difference is that small; I guess I'm accustomed to the used market in the US, which I imagine is quite a bit larger.

1

Numerous_Talk7940 OP t1_jdbifz9 wrote

I see, thanks for your help! The 4090 is from Palit, though, while the used 3090/3090Ti cards are from MSI or Gigabyte, which are more expensive in general. For my use case, however, the brand should not matter. Does the advice about undervolting still hold for the 4090?

1

GrandDemand t1_jdbjfmw wrote

Palit is a good brand from what I've heard! And yeah, I'm not as familiar with it, but googling the same thing regarding power limiting/undervolting the 4090 will bring up similar results about optimizing its efficiency. You can likely get it to stay under 350W under load without any performance regression, possibly even with a small boost. The 4090 is incredibly efficient given its performance, and it also draws way less power when idling than the 3090 or 3090Ti. I had no idea the price gap was that small, haha; otherwise I would've recommended the 4090 straight away, especially given the increase in energy prices you've more than likely experienced.

1