Submitted by Nerveregenerator t3_z0msvy in deeplearning
Ok, so I'm considering upgrading my deep learning PC. I'm currently using a 1080 Ti. From my perspective it's still a relatively solid card, and it can be picked up on eBay for $200. So my question is: would I be better off with four 1080 Tis or one 3090? They should be reasonably similar in price. I'm also aware I'll need a CPU that can handle this, so if you have any suggestions for a motherboard and CPU that can keep four 1080 Tis fed with tensors, that would be helpful too. I can't seem to find a straight answer on why this setup isn't more popular, because the cost/performance ratio of the 1080 Ti seems great.
Thanks
EDIT
- so it sounds like a 3090 is the best move, to avoid the complexities associated with multiple GPUs. What would you think of a pip package that let you benchmark your setup for deep learning and then compare your results with other users? Would that be something you'd be interested in?
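A minimal sketch of what the core of such a benchmark package might look like. This is purely illustrative: the function name `benchmark_matmul` is hypothetical, and it uses a NumPy matrix multiply on the CPU as a stand-in for a real GPU workload (an actual tool would time framework-specific work, e.g. a PyTorch forward/backward pass on each device).

```python
import time
import numpy as np

def benchmark_matmul(size=1024, repeats=5):
    """Time repeated square matmuls as a crude throughput proxy.

    Hypothetical sketch: a real deep learning benchmark would run
    GPU workloads via a framework; this only illustrates the idea of
    producing a comparable score for a given machine.
    """
    a = np.random.rand(size, size).astype(np.float32)
    b = np.random.rand(size, size).astype(np.float32)
    a @ b  # warm-up run, excluded from timing
    start = time.perf_counter()
    for _ in range(repeats):
        a @ b
    elapsed = time.perf_counter() - start
    # An n x n by n x n matmul costs roughly 2*n^3 FLOPs
    gflops = (2 * size**3 * repeats) / elapsed / 1e9
    return {"size": size, "seconds": elapsed, "gflops": gflops}

result = benchmark_matmul()
print(f"{result['size']}x{result['size']} matmul: {result['gflops']:.1f} GFLOP/s")
```

Scores like this could then be uploaded and compared across users' hardware, which is presumably what the proposed pip package would do.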
Star-Bandit t1_ix6l9wf wrote
You might also check out some old server hardware. I have a Dell R720 running two Tesla K80s, each of which is essentially the equivalent of two 1080s. While it may not be the latest and greatest, the server ran me $300 and the two cards ran me $160 on eBay.