Viewing a single comment thread. View all comments

DevarshTare t1_j95ct2p wrote

What matters while running models?

Hey guys, I'm new to machine learning and just learning the basics. I am planning to buy a GPU soon for running pre-built models from Google Colab.

My question is: after you build a model, what matters for the model's runtime? Is it the memory, the bandwidth, or the CUDA cores you utilize?

Basically, what makes an already-trained model run faster in an application? I imagine it varies from application to application, but I just wanted to learn what matters most when running pre-trained models.

1

nikola-b t1_j9crv4u wrote

I would say memory is more important. Buy the 3060 with 12GB, or if you have more money, get the 3090 with 24GB. Memory matters more in my view because it will allow you to run bigger models at all.
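To make the "bigger models need more memory" point concrete, here is a rough back-of-the-envelope sketch (the function name, the fp16 assumption, and the 7B example are mine, not from the thread; real usage also needs headroom for activations and framework overhead):

```python
def estimate_param_vram_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM needed just to hold the weights (ignores activations
    and framework overhead). bytes_per_param: 2 for fp16, 4 for fp32."""
    return num_params * bytes_per_param / 1024**3

# A hypothetical 7-billion-parameter model in fp16:
weights_gb = estimate_param_vram_gb(7e9, bytes_per_param=2)
print(round(weights_gb, 1))  # ~13 GB of weights alone
```

By this estimate, such a model's weights would overflow a 12GB 3060 but fit (with room for activations) in a 3090's 24GB, regardless of how fast either card's cores are.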

1