Submitted by jnfinity t3_11e3fbj in deeplearning

I am currently building a new workstation for myself, and I am wondering whether it makes more sense to use dual 3090s (or 3090 Tis) with NVLink and get the extra VRAM that way, or instead to get a single 4090 (since, sadly, it no longer supports NVLink).

I mostly use it for smaller experiments with PyTorch / PyTorch Lightning before moving to our training cluster.


Comments


CKtalon t1_jacill4 wrote

Since you are working on smaller experiments, go with the single 4090. NVLink is overhyped.


jnfinity OP t1_jacno4o wrote

I do run into VRAM constraints from time to time, though... but I was thinking an A6000 Ada was a bit overkill.
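(For anyone weighing 24 GB vs. 48 GB, a quick back-of-envelope VRAM estimate helps. The sketch below uses commonly assumed per-parameter costs for fp32 training with Adam — 4 bytes for weights, 4 for gradients, 8 for the two optimizer moments — and ignores activations, so treat the numbers as a rough lower bound, and the 1.3B parameter count as a made-up example.)

```python
# Rough VRAM estimate for fp32 training with Adam.
# Assumed costs per parameter (activations excluded):
#   weights: 4 bytes, gradients: 4 bytes, Adam moments: 2 x 4 bytes = 8 bytes
# => ~16 bytes per parameter as a lower bound.

def training_vram_gib(num_params: int, bytes_per_param: int = 16) -> float:
    """Approximate VRAM footprint in GiB for weights, gradients,
    and Adam optimizer state. Activations are not counted."""
    return num_params * bytes_per_param / 1024**3

# A hypothetical 1.3B-parameter model:
print(round(training_vram_gib(1_300_000_000), 1))  # ~19.4 GiB, near a 4090's 24 GB
```

By this estimate a model in that size range already crowds a single 24 GB card once activations are added, which is where the dual-3090 option starts to look attractive.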


ZaZaMood t1_jacingn wrote

Where are you going to find two 3090 cards brand new?

Here is my post with some answers:

https://www.reddit.com/r/MachineLearning/comments/xiwc12/buy_rtx_3090x2_or_single_4090_d/

I just couldn't find two brand-new 3090s for the price of one 4090.


lukaszpi t1_jae8dlx wrote

Why exactly do they need to be brand new? I've heard warnings about cards that were used for mining, but is there anything else to worry about? Thx
