Maxerature OP t1_j5ihite wrote

Is it possible to disable the secondary GPU when not performing ML tasks so that it doesn't interfere with other tasks?

1

FastestLearner t1_j5iklgu wrote

If you don't engage the second GPU, it stays dormant and shouldn't interfere with anything. For example, if you train a network in PyTorch without DP or DDP, it will use the first GPU by default. You can always control which GPU(s) a process sees via the CUDA_VISIBLE_DEVICES environment variable. Also, make sure the primary GPU occupies the first PCIe slot; you can verify this with nvidia-smi. If the display is hooked up to the primary GPU, it will show slightly higher memory usage (~100 MB) than the other GPUs because of display server processes like Xorg.
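
A minimal sketch of what I mean (assumes a two-GPU box and a recent PyTorch; the model and tensor shapes are just placeholders):

```python
import os

# Must be set before CUDA is initialized, so do it before importing torch.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # only the first GPU is visible to this process

import torch

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model = torch.nn.Linear(128, 10).to(device)      # placeholder model
x = torch.randn(32, 128, device=device)          # placeholder batch

print(torch.cuda.device_count())  # prints 1 -> the second GPU stays idle
print(model(x).shape)
```

With CUDA_VISIBLE_DEVICES="0", the second GPU is simply invisible to that process, so nothing you launch there can touch it.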

2