Submitted by zveroboy152 t3_102n6qp in MachineLearning

Greetings!


In my adventures with PyTorch, and supporting ML workloads in my day-to-day job, I wanted to continue homelabbing and build out a compute node to run ML benchmarks and jobs on.


This brought me to the AMD MI25: for $100 USD, it's surprising how much horsepower and VRAM you can get for the price. Hopefully my write-up will help someone in the machine learning community.


Let me know if you have any questions or need any help with a GPU compute setup. I'd be happy to assist!


https://www.zb-c.tech/2022/11/20/amd-instinct-mi25-machine-learning-setup-on-the-cheap/

36

Comments


gradientpenalty t1_j2u9byl wrote

Do you have any benchmarks to share? Would be very nice if those were available.

3

zveroboy152 OP t1_j2uuiy1 wrote

Those are coming soon. I'm working on collecting a few sub-$100 GPUs and running them through the benchmark suite from PyTorch's repo:

https://github.com/pytorch/benchmark
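For anyone who wants a rough number before the full suite runs, the heart of a latency benchmark is just warmup-then-average timing. Here's a minimal, hardware-agnostic sketch of that idea (the function name and the dummy workload are my own, not from the pytorch/benchmark repo):

```python
import time

def benchmark(fn, warmup=3, iters=10):
    """Average wall-clock latency of fn(): a few warmup runs, then timed runs."""
    for _ in range(warmup):
        fn()  # warm caches / JIT / allocator before measuring
    start = time.perf_counter()
    for _ in range(iters):
        fn()
    return (time.perf_counter() - start) / iters

# Dummy CPU workload standing in for a model forward pass.
avg_s = benchmark(lambda: sum(i * i for i in range(10_000)))
print(f"avg latency: {avg_s * 1000:.3f} ms")
```

On an actual GPU you'd wrap a model's forward pass instead of the dummy lambda, and call `torch.cuda.synchronize()` before reading the clock, since CUDA/ROCm kernel launches are asynchronous.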


I'll be sure to follow up and post some numbers. :-)

8

currentscurrents t1_j2uul25 wrote

Interesting how that card went from $15k to $100 in the space of five years.

I'm holding out hope the A100 will do the same once it's a couple generations old.

3

5death2moderation t1_j2wrwrl wrote

Tesla M40s, and now P100s, were $200 apiece just four years after release. V100s have not depreciated as quickly, though, presumably because the tensor cores keep their performance competitive. I would assume A100s will suffer the same fate of staying very expensive for many years to come, sadly.

6

CrashTimeV t1_j3zt9sk wrote

Are you the person from Craft Computing's Discord who is running Stable Diffusion on his MI25?

2

zveroboy152 OP t1_j41idsx wrote

I am not, but that sounds like a very cool thing to run on it. :-) (I'm a big fan of Craft Computing.)

1

CrashTimeV t1_j41im7e wrote

You should look into P100s in the future. They are coming down in price pretty fast; I think you can get one for around $200 US.

2

zveroboy152 OP t1_j41q48e wrote

I ordered one three days ago for $170. ;-) I hope to be doing some testing and write ups on it soon.

1

CrashTimeV t1_j41qigp wrote

Let me know if you figure out GPU Direct Storage. I just got 2x P100s and an R730 for my ML rig, then later found out my SSDs were not the correct ones, so I'm waiting for the new ones to arrive. Can't wait to integrate this into my lab and workflow.

1

zveroboy152 OP t1_j472zg8 wrote

That sounds like a pretty sick machine! I'll check out GPU Direct Storage and see if I can get it working. :-)

1

SnooHesitations8849 t1_j2uw5u7 wrote

If AMD were good at providing drivers, this would be a game changer for beginners. LoL.

1

currentscurrents t1_j2uxtmv wrote

Yeah... it's no A100, but it's on par with today's high-end gaming cards, for much less money.

4

zveroboy152 OP t1_j2v9jj8 wrote

Agreed! 16GB of HBM2 memory is impressive for the price. :-)

6

fakesoicansayshit t1_j3dd43i wrote

Would this run on Windows?

1

Iwishtoeatfish t1_j3rcglx wrote

It can. You would need to flash the vBIOS with a WX 9100 image. It would only have a single Mini DisplayPort output, though.

2