Submitted by xyrlor t3_ymoqah in deeplearning
fjodpod t1_iv52eke wrote
Is it possible?
Yes, but you probably need newer Linux distros and some basic Linux knowledge.
Did I do it myself?
Yes, in PyTorch with an RX 6600. It was a bit annoying to set up and threw some errors, but now it just works (I haven't benchmarked it yet).
Do I recommend it for the average user?
No, you should only do it if you suddenly find yourself wanting to do machine learning but are stuck with an AMD card.
If you haven't bought a GPU yet and are considering machine learning, avoid the setup hassle and just pay a bit more for an Nvidia GPU. The 3060 12GB is a good-value graphics card for machine learning.
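For anyone trying the AMD route: ROCm builds of PyTorch reuse the `torch.cuda` API, so a quick sanity check after setup looks the same on both vendors. This is a minimal sketch, assuming a ROCm (or CUDA) build of PyTorch is installed; on a CPU-only build it simply reports that no GPU is visible.

```python
import torch

# On ROCm builds of PyTorch, AMD GPUs show up through the
# torch.cuda API, so this check works for both vendors.
if torch.cuda.is_available():
    # Index 0 is the first visible GPU.
    print("GPU detected:", torch.cuda.get_device_name(0))
else:
    print("No GPU visible to PyTorch (CPU-only build or setup issue)")
```

If the AMD card is set up correctly, a small tensor op like `torch.randn(8, device="cuda")` should run without errors.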
xyrlor OP t1_iv5axmk wrote
Thanks! I’m currently running a 3070, but I'm hitting some errors unrelated to deep learning, so I’m looking at options while I send my card in for repairs. Since new GPUs have been announced by both Nvidia and AMD, I was curious about the outlook for both gaming and deep learning side projects.
fjodpod t1_ivk6pbg wrote
Personally, I would either wait for the 4000-series mid-tier cards or just buy a 3000-series card with enough VRAM. However, keep in mind that the 4000 series could theoretically be worse than, or merely on par with, the 3000 series for machine learning in some cases due to lower memory throughput: (https://www.reddit.com/r/MachineLearning/comments/xjt129/comment/ipb6p8y/?utm_source=share&utm_medium=web2x&context=3)
xyrlor OP t1_ivlicka wrote
That's what I'm currently considering too. But I'm not optimistic about mid-tier card prices, considering how the 4080 and 4090 are priced.