Submitted by seanrescs t3_10p4lhq in MachineLearning

I am a researcher at a US university and have a budget of $25k to build a PC for training various ML algorithms (e.g. DRL, neuromorphic computing, VAEs, etc). I'm trying to decide between going for a prebuilt workstation (like https://lambdalabs.com/gpu-workstations/vector) or building with consumer cards like 4090s.

Any advice on which gives the most bang for the price? I'm not sure how much I'm giving up by going for consumer 24GB cards vs the A6000 or 6000 Ada, but prebuilt prices go up quickly. Warranty vs building it myself isn't an issue.

2

Comments


synth_mania t1_j6i8qkk wrote

Well, you are sacrificing GPU virtualization, AFAIK. Only enterprise cards get native support for that feature without hacks that may or may not work.

6

jiamengial t1_j6iiux3 wrote

Where do you plan to put the machine? If it's anywhere near where you (or anyone else) work, I'd recommend getting it liquid-cooled if you want to save your hearing.

The A6000s don't have active cooling themselves and are definitely meant to last a whole lot longer than the 4090s, so they will be better if you plan to use the machine for quite a while or want to retain resale value down the road.

1

seanrescs OP t1_j6j2fef wrote

It can be stored in an active lab environment, or away from people if noise is an issue; it's more about which will give the most utility for the longest time. It seems the A6000 is the better choice according to Tim Dettmers, so I will probably go with that one if I can get it quoted at a good price.

2

Aggressive_Bass2755 t1_j6n6y33 wrote

I think the best thing for you is to find investors for your project, like angel investors, or to open a GoFundMe. You definitely need more than $25k, because you don't want to get stuck halfway, either out of money or with a lower-quality result.

1

deepstatefarm t1_j6oy6ps wrote

For half that, $12k (and get two machines), I would go with an AMD platform with PCIe 5.0 and 4x GPU slots. Personally I would get used 3090s, but if you want a warranty, new 4090s. I haven't RMA'd anything in a long time, but be warned it could take over a year to get a replacement card. The 4090 might not support the new upcoming 4-bit LLM mode. Not sure.
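For scale, the 4-bit LLM mode mentioned here mostly matters for fitting model weights into VRAM. A back-of-the-envelope sketch (the model sizes and the 24 GB consumer-card figure are illustrative, and overheads like activations and the KV cache are ignored):

```python
# Rough VRAM needed for model weights alone at different precisions.

def weight_vram_gb(n_params_billion: float, bits_per_param: int) -> float:
    """Gigabytes needed to hold the weights at the given precision."""
    return n_params_billion * 1e9 * bits_per_param / 8 / 1e9

for params in (7, 13, 30, 65):
    fp16 = weight_vram_gb(params, 16)
    int4 = weight_vram_gb(params, 4)
    fits = "fits" if int4 <= 24 else "does not fit"
    print(f"{params}B params: fp16 {fp16:.1f} GB, 4-bit {int4:.1f} GB "
          f"({fits} in a 24 GB card)")
```

The point: a 13B model needs ~26 GB at fp16 but only ~6.5 GB at 4-bit, which is the difference between needing a pro card and fitting comfortably on a 24 GB consumer card.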

Server rack, with 2.5GbE networking, 1-to-1 VRAM-to-RAM (with 20% more CPU RAM), a 2TB NVMe drive or better, and 40-60TB of bulk storage.
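The 1-to-1 VRAM-to-RAM rule with ~20% headroom can be sketched like this (the 24 GB-per-card figure is just an example for 3090/4090-class cards, and rounding up to a power of two is my assumption about practical DIMM totals):

```python
import math

def recommended_ram_gb(num_gpus: int, vram_per_gpu_gb: int,
                       headroom: float = 0.20) -> int:
    """Match system RAM to total VRAM plus ~20% headroom,
    rounded up to the next power-of-two DIMM total."""
    needed = num_gpus * vram_per_gpu_gb * (1 + headroom)
    return 2 ** math.ceil(math.log2(needed))

# 4x 24 GB cards -> 96 GB VRAM -> 115.2 GB needed -> buy 128 GB
print(recommended_ram_gb(4, 24))
```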

2