Submitted by jnfinity t3_11e3fbj in deeplearning

I am currently building a new workstation for myself and I am wondering whether it makes more sense to use dual 3090s (or 3090 Tis) with NVLink and make use of the extra VRAM that way, or instead get a single 4090 (since they sadly don't support NVLink anymore).

I mostly use it for smaller experiments with PyTorch / PyTorch Lightning before moving to our training cluster.
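For context, the kind of smaller experiment I mean looks roughly like this (a toy model just for illustration, not my actual code); the same script runs on one 4090 or on two 3090s:

```python
# Toy LightningModule, just to make the sketch self-contained.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class ToyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.net(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


if __name__ == "__main__":
    data = DataLoader(TensorDataset(torch.randn(1024, 32), torch.randn(1024, 1)),
                      batch_size=64)

    # Single 4090:
    trainer = pl.Trainer(accelerator="gpu", devices=1, max_epochs=1)

    # Dual 3090 / 3090 Ti (data-parallel): each GPU holds a full model replica,
    # so two 24 GB cards do not pool into 48 GB for DDP-style training.
    # trainer = pl.Trainer(accelerator="gpu", devices=2, strategy="ddp", max_epochs=1)

    trainer.fit(ToyModel(), data)
```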


Comments


CKtalon t1_jacill4 wrote

Since you're working on smaller experiments, a single 4090. NVLink is overhyped.
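If you want to see what you'd actually get on a 2-GPU box, a rough sketch in plain PyTorch: it prints each card's VRAM and whether peer-to-peer access (the thing NVLink buys you) is even available; DDP syncs over PCIe regardless:

```python
# Rough sanity check for a multi-GPU machine (plain PyTorch, no Lightning needed).
import torch

for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")

if torch.cuda.device_count() >= 2:
    # True roughly means direct GPU<->GPU memory access works (e.g. via NVLink);
    # if False, DDP still trains fine, gradients just go over PCIe/host memory.
    print("P2P 0<->1:", torch.cuda.can_device_access_peer(0, 1))
```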
