Is my plan for a 4x RTX 3090 Machine Learning rig feasible?
Submitted by [deleted] (t3_118pjv9) on February 22, 2023 at 5:05 AM in deeplearning · 7 comments
suflaj (t1_j9iwxuj) wrote on February 22, 2023 at 8:06 AM:
You would have to limit the power to 250 W. The rig will overheat without an open case. PCIe 3.0 x8 means you are cutting each card's bandwidth in half compared to x16. Overall a terrible idea.
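For anyone who wants to check the power cap and PCIe link width on their own box, here is a minimal sketch using pynvml (the nvidia-ml-py bindings). The 250 W figure just mirrors the number suggested above, not a measured optimum, and actually lowering the limit typically requires root (e.g. `sudo nvidia-smi -pl 250`):

```python
# Sketch: inspect power limits and PCIe link config of each GPU via pynvml.
# Install with `pip install nvidia-ml-py`; assumes an NVIDIA driver is present.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(h)
        if isinstance(name, bytes):  # older pynvml versions return bytes
            name = name.decode()
        # Power limits are reported in milliwatts
        cur_limit = pynvml.nvmlDeviceGetPowerManagementLimit(h) / 1000
        max_limit = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(h)[1] / 1000
        # Current PCIe generation and lane width (e.g. gen 3, x8)
        gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(h)
        width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(h)
        temp = pynvml.nvmlDeviceGetTemperature(h, pynvml.NVML_TEMPERATURE_GPU)
        print(f"GPU {i} {name}: limit {cur_limit:.0f} W (max {max_limit:.0f} W), "
              f"PCIe gen{gen} x{width}, {temp} C")
finally:
    pynvml.nvmlShutdown()
```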