
dhruvdh t1_ispgllz wrote

Reply to comment by Moppmopp in rx6900xt for ML? [D] by Moppmopp

It is potentially enough. But most material on the internet assumes you have a CUDA device, so as a novice it would make sense to take the path of least resistance.

If you do not have that option, look into https://www.amd.com/en/graphics/servers-solutions-rocm-ml. It won't explicitly say your card is supported, but it should run fine.

ROCm ML is supported only on Linux, as far as I know.
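
If you do go that route, here's a minimal sanity-check sketch, assuming you've installed a PyTorch build from the ROCm wheel index (the exact index URL/version is up to you). On ROCm builds PyTorch exposes the AMD GPU through the usual `torch.cuda` API, so this is enough to confirm the card is actually being used:

```python
# Minimal sanity check for a ROCm PyTorch install.
# Assumes a ROCm wheel was installed, e.g. something like:
#   pip install torch --index-url https://download.pytorch.org/whl/rocm<version>
import torch

# On ROCm builds, PyTorch reuses the torch.cuda API for AMD GPUs.
print("GPU visible:", torch.cuda.is_available())

if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    # Run a tiny matmul on the GPU to confirm kernels actually launch.
    x = torch.randn(1024, 1024, device="cuda")
    print("Matmul OK:", (x @ x).sum().item())
```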


Moppmopp OP t1_ispheog wrote

How about an RTX 3080 as an alternative? Would you say that would be the overall better choice? I am hesitant because it's 50€ more expensive while having 6 GB less VRAM and worse rasterization performance.


dhruvdh t1_ispjncc wrote

Have you considered not buying a GPU at all and making use of paid services such as Google Colab, lambdacloud, etc.?

You can use these while you learn, figure out your actual requirements, and make a more educated decision later.

Colab's free tier works great for short experiments, and the next tier up is just $10 a month.
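
For what it's worth, the free tier assigns whatever GPU is available per session, so a quick check like the sketch below (plain PyTorch, nothing Colab-specific assumed beyond the Runtime menu) tells you what you got:

```python
# Quick check of whichever GPU a Colab session assigned (varies per session).
import torch

if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # e.g. a Tesla T4 on the free tier
    props = torch.cuda.get_device_properties(0)
    print(f"{props.total_memory / 1e9:.1f} GB VRAM")
else:
    print("No GPU assigned; check Runtime > Change runtime type.")
```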

AMD is also set to announce new GPUs on November 3; depending on their pricing, prices on last-gen cards should come down.
