Submitted by GhostingProtocol t3_11k59br in deeplearning

Hi r/deeplearning. I'm very interested in AI and computer vision, and I'm building a workstation so I can start doing my own projects. My problem is VRAM. I wanted to buy an AMD card because it's more bang for your buck and you get 16GB of VRAM, but I found out that AMD's software stack sucks, so Nvidia seems like the way to go (almost required, apparently).

The 3070 Ti (8GB VRAM) was my choice before I found out how important VRAM is, so the 3060 Ti/3070/3070 Ti are out of the question. The 4000-series seems very overpriced, and anything older than the 3000-series seems outdated. That leaves me with the 3060 (12GB/3584 CUDA cores) and the 3080 (10GB/8704 CUDA cores). The 3080 Ti and up fall outside my budget.

My thinking here is: the 3080 has twice as many CUDA cores, but only 10GB of VRAM (the 12GB version is very hard to find in my country). The 3060 has more VRAM and is a lot cheaper, but again, only half the CUDA cores of the 3080.

The obvious answer to me here seems to be to buy the 3080 now, and if VRAM ever ends up being a limitation, I can buy a riser and a 3060 and have a combined 22GB of memory while running the 3060 externally. Does this seem like a good way of solving this problem?
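One caveat I've read about (could be wrong): the memory wouldn't pool into a single 22GB block. A framework like PyTorch sees two separate devices, each with its own memory, so I'd have to split models across them myself. A minimal sketch of what I mean, assuming PyTorch and made-up layer sizes:

```python
import torch

# Each card shows up as its own device with its own memory pool.
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"cuda:{i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")

# To use both cards for one model, you split it manually (model parallelism):
class TwoGPUNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.part1 = torch.nn.Linear(1024, 1024).to("cuda:0")  # e.g. the 3080
        self.part2 = torch.nn.Linear(1024, 10).to("cuda:1")    # e.g. the 3060

    def forward(self, x):
        x = self.part1(x.to("cuda:0"))
        return self.part2(x.to("cuda:1"))
```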


Comments


LetMeGuessYourAlts t1_jb88bdv wrote

Consider used if you really want to maximize what you can do on a budget. It's very likely going to give you performance identical to new, and you can get a used 3090 for the cost of a new 3070 Ti, which opens so many doors for what you can do memory-wise.


GhostingProtocol OP t1_jb88x11 wrote

I’d buy used anyways, kinda a hot take but I refuse to give NVIDIA money :P

I'm thinking of going with a 3090 for $900 or a 3080 for $650 (I can get an FE for $750, which would be pretty epic).

Got any advice? I don't like that the 3080 only has 10GB of VRAM, but the 3080 is already pretty much overkill for anything I'd use it for other than deep learning. Kinda on the fence here, tbh.


LetMeGuessYourAlts t1_jbbw4d4 wrote

I just bought a used 3090 for $740 on eBay, before tax. I view my GPU as a for-fun expenditure, and part of that is ML stuff. For the cost of a handful of new-release video games, you can go from 10GB to 24GB and do a lot of cool stuff. Less and less state-of-the-art work is going to fit comfortably in 10GB.


suflaj t1_jb91xiw wrote

The 3060s are pretty gimped for DL. Currently, you either go for a 3090/4090, or a used 1080 Ti/2080 Ti (or several) if on a budget.


GhostingProtocol OP t1_jb928dg wrote

Yeah, I found out about GPU bus width a few hours ago. At least I've learned a lot about GPUs, haha.
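For anyone else catching up on this: bandwidth is roughly bus width times effective memory clock. A quick sanity check with spec-sheet numbers as I remember them (so double-check):

```python
# bandwidth (GB/s) ~= bus width (bits) / 8 * effective memory clock (Gbps)
cards = {
    "RTX 3060": (192, 15.0),  # 192-bit bus, 15 Gbps GDDR6
    "RTX 3080": (320, 19.0),  # 320-bit bus, 19 Gbps GDDR6X
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
# RTX 3060: 360 GB/s
# RTX 3080: 760 GB/s
```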

I'll buy a 3090 when my current 6GB laptop GPU stops working for my purposes.


suflaj t1_jb92dey wrote

Well, to be honest, unless there's some particular reason why you need the GPUs locally, the most cost-effective solution is to just run things in the cloud.

Having GPUs locally is mostly a luxury for when some contract prevents you from using the cloud, or you need to train something every day for several hours over a year or more. For everything else, cloud pay-as-you-go will be cheaper and faster.
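Rough break-even math, with made-up rates (plug in what you actually pay):

```python
# Hypothetical numbers: a $900 used 3090 vs. a comparable cloud GPU
# at $1.10/hour. Both are assumptions; substitute your real prices.
gpu_price = 900.0      # USD, upfront
cloud_rate = 1.10      # USD per GPU-hour

break_even_hours = gpu_price / cloud_rate
print(f"Break-even after ~{break_even_hours:.0f} GPU-hours")

# At 3 hours of training per day, that's under a year:
print(f"~{break_even_hours / 3:.0f} days at 3 h/day")
```

That ignores electricity and resale value, which cut in opposite directions, but the shape of it holds: daily training pays off the card, occasional experiments don't.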
