Submitted by boosandy t3_zgn046 in deeplearning
I bought an RTX 2060 with 12 GB of VRAM for my DL projects, but my desktop already has a GTX 980. If I install the 2060 alongside the GTX 980 and connect my display to the 980, will PyTorch be able to use the whole VRAM of the 2060?
Is this even a valid setup? Please help.
Volhn t1_izhrkd8 wrote
Yes. You might have to specify which device to use though. You also can’t combine memory into one pool, but you can parallelize.
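Something like this is all "specify which device" means (a minimal sketch; the 2060 is assumed to be index 0 here, but the index depends on how CUDA enumerates your cards, so check the printout first):

```python
import torch

# List the GPUs PyTorch can see; enumeration order may not match physical slots.
for i in range(torch.cuda.device_count()):
    print(i, torch.cuda.get_device_name(i))

# Assume the 2060 shows up as index 0 (it may be 1 on your system).
device = torch.device("cuda:0")

model = torch.nn.Linear(1024, 10).to(device)  # toy model, just to illustrate
x = torch.randn(32, 1024, device=device)      # allocate tensors on the same GPU
out = model(x)

# The 980 driving the display doesn't block this; it only reserves a little
# VRAM on that card, which you aren't training on anyway.
print(torch.cuda.memory_allocated(device))    # bytes allocated on the 2060
```

You can also set the CUDA_VISIBLE_DEVICES environment variable so PyTorch only sees the 2060, in which case `cuda:0` always refers to that card.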