Submitted by GPUaccelerated t3_yf3vtt in MachineLearning
PassionatePossum t1_iu3k7ga wrote
For me, speed is only important when it comes to inference. And for the problems I work on (medical imaging) I don't scale the hardware to the model, I scale the model to the hardware.
I don't train models that often, and when I do I don't really care whether it takes 2 days or 2.5 days. Of course, if you are working on really large-scale problems or need to search through a huge parameter space you will care about speed, but I doubt the average guy is that interested. During training, the most important resource for me is VRAM.
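(A quick back-of-the-envelope sketch of why VRAM dominates during training: weights, gradients, and optimizer state all live on the GPU at once. The `copies=4` factor below assumes fp32 weights plus gradients plus Adam's two moment buffers; activation memory is workload-dependent and excluded, so this is a lower bound, not an exact figure.)

```python
# Rough VRAM lower bound for training a model with Adam, everything in fp32.
# Assumes: weights + gradients + two Adam moment buffers = 4 copies of the
# parameters, 4 bytes each. Activations are NOT included.
def training_vram_gb(n_params, bytes_per_param=4, copies=4):
    return n_params * bytes_per_param * copies / 1024**3

# e.g. a 100M-parameter model needs roughly 1.49 GB before activations
print(round(training_vram_gb(100e6), 2))  # → 1.49
```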
GPUaccelerated OP t1_iu4xju0 wrote
This is what I'm seeing the most. Which makes so much sense for your use case.
Thank you for sharing!