
Moppmopp OP t1_isontnp wrote

Reply to comment by ZestyData in rx6900xt for ML? [D] by Moppmopp

Thank you for your detailed answer. So, to make it short: GPU performance and VRAM don't matter at all if the GPU doesn't have dedicated CUDA cores? Or, in other words, is it nearly impossible to run ML stuff on AMD cards?


Blasket_Basket t1_isp0d5p wrote

Yep, pretty much. AMD cards are pretty close to useless when it comes to Deep Learning. Shallow algorithms (anything that is ML but not DL) typically run on the CPU, not the GPU.

For DL, you need Nvidia cards.
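You can see why directly: mainstream frameworks gate GPU execution on a CUDA check. A minimal sketch, assuming a stock (CUDA-built) PyTorch install:

```python
# Stock PyTorch builds look for a CUDA device; on an AMD card they find
# none and silently fall back to the CPU.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running on: {device}")  # prints "cpu" on an AMD card with a stock build

# Everything downstream then runs wherever `device` points.
x = torch.randn(4, 4, device=device)
print(x.sum())
```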


dhruvdh t1_ispgx2z wrote

Please don't spread misinformation by stating your opinion as fact.


dhruvdh t1_ispgllz wrote

It is potentially enough. But most material on the internet assumes you have a CUDA device, so as a novice it would make sense to take the path of least resistance.

If you have no other option, look into https://www.amd.com/en/graphics/servers-solutions-rocm-ml. It won't explicitly say your card is supported, but it should run fine.

ROCm ML is supported only on Linux, as far as I know.
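If you do go the ROCm route, the nice part is that PyTorch's ROCm builds reuse the torch.cuda namespace, so CUDA-targeted tutorial code usually runs unchanged. A rough sketch, assuming a ROCm wheel from pytorch.org on Linux; the HSA_OVERRIDE_GFX_VERSION setting is a commonly reported workaround for RDNA2 cards like the 6900 XT, not an official guarantee:

```python
# Assumes a ROCm build of PyTorch on Linux (e.g. installed from the
# pytorch.org ROCm wheel index). For cards missing from the official
# support list, users often report needing, before launch:
#   export HSA_OVERRIDE_GFX_VERSION=10.3.0   # RDNA2 workaround (assumption)
import torch

if torch.cuda.is_available():  # ROCm builds answer True here, despite the name
    print(torch.cuda.get_device_name(0))  # should report the Radeon card
    x = torch.randn(1024, 1024, device="cuda")
    print((x @ x).mean())  # matmul on the GPU as a quick smoke test
else:
    print("No GPU visible; check the driver install and which wheel you grabbed.")
```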


Moppmopp OP t1_ispheog wrote

How about an RTX 3080 as an alternative? Would you say that would be the overall better choice? I am hesitant because it's €50 more expensive while having 6 GB less VRAM and worse rasterization performance.


dhruvdh t1_ispjncc wrote

Have you considered not buying a GPU at all and making use of paid services from Google Colab, Lambda Cloud, etc.?

You can use these while you learn, get a clearer picture of your requirements, and make a more educated decision later.

The Colab free tier works great for short experiments, and the next tier up is just $10 a month.
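Once you're in a Colab session, it's worth checking what you actually got, since the free tier hands out different cards. A small sketch, assuming a GPU runtime is selected (Runtime -> Change runtime type -> GPU):

```python
# Report which GPU (if any) the Colab session assigned.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(torch.cuda.get_device_name(0))       # e.g. a Tesla T4 on the free tier
    print(f"{props.total_memory / 1e9:.1f} GB VRAM")
else:
    print("CPU-only runtime; switch the runtime type to GPU.")
```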

AMD is also set to announce new GPUs on November 3; depending on how those are priced, all last-gen prices should come down.
