Submitted by [deleted] t3_11g2dio in deeplearning

I am a junior dev using Google Colab for training small DL models because it is a lot faster than my Lenovo Ideapad 2015.

And I have rather quickly run out of computing units (even though I terminate runtimes after each calculation).

Are there sites/platforms or just hardware you'd recommend for training simple PyTorch, TensorFlow, SciPy, or plain NumPy DL models?

15

Comments


Boonzies t1_jamckvc wrote

Bottom line: in this field you will have to pay. The equipment matters.

For most models I use my PC or laptop, both of which have decent NVIDIA cards, good CPUs, and ample RAM. I develop my models offline, and when they're ready (if need be) for the final large-scale training I use my AWS/Google accounts.

For super large models I go straight to AWS or Google.

19

jcoffi t1_jamq5h8 wrote

You can get around this with "sweat equity". But money is more effective.

I've used Ray.io in the past to connect old laptops with Nvidia GPUs together on a local network to get the job done.

8

[deleted] OP t1_jamvfi6 wrote

Great info, thank you. I am not against paying, but it is a shame if you are just starting out and have the skills to write models but can't train them properly.

My laptop is a fairly old Lenovo with a Radeon graphics card rather than NVIDIA, so it can't use CUDA if I am correct.

I wonder how difficult it is to set it up for calculations. A card like that is useful for "GPU calculations with CUDA", correct? That is so much faster, AFAIK.

3

Boonzies t1_jamzn2t wrote

There are some AMD GPUs that might work. But NVIDIA is just so much better integrated with TensorFlow and PyTorch, which makes it the much better hardware choice.
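On the AMD question: CUDA itself is NVIDIA-only, but PyTorch also publishes ROCm builds for some AMD cards, and those reuse the same `torch.cuda` API, so device-agnostic code covers both. A minimal sketch, assuming PyTorch is installed:

```python
import torch

# True on CUDA (NVIDIA) builds with a visible GPU, and also on ROCm
# (AMD) builds, which expose AMD GPUs through the same torch.cuda API.
use_gpu = torch.cuda.is_available()
device = torch.device("cuda" if use_gpu else "cpu")

# The same tensor/model code then runs unchanged on GPU or CPU.
x = torch.randn(4, 4, device=device)
print(device, x.sum().item())
```

Writing everything against `device` like this means the script still runs (just slower) on a CPU-only laptop.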

I equate machine learning to art. You may have great ideas and know-how but at some point you have to buy the material (e.g., good paint, canvas, the better brushes, clay, etc.) to bring your ideas to life.

I know it's frustrating having slow machines when testing models... It really kills the enthusiasm.

I suppose the best thing to do is to get involved with or work for groups that have the hardware, buy good new hardware, or build your own for half the price.

4

webauteur t1_jan0grf wrote

GitHub has Codespaces, which can be used for machine learning.

3

DaBobcat t1_janob29 wrote

I'm only familiar with Kaggle/Colab for free GPUs

0

lizelive t1_jans9z8 wrote

Azure ML has compute instances.

1

I_will_delete_myself t1_japdqeh wrote

Use spot instances on the cloud. It's a lot cheaper than buying a 3k rig unless you train models throughout the entire year. You can also connect Colab to a VM on GCP, so everything feels the same, just with a different backend.

Spot instances and Lambda Cloud have super low margins, without the big markup of traditional cloud products.
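For reference, requesting a Spot GPU VM with the gcloud CLI looks roughly like this; the instance name, zone, machine type, and accelerator below are illustrative placeholders, not recommendations:

```shell
# Create a Spot VM with one T4 GPU and a prebuilt deep learning image.
# Spot VMs can be reclaimed at any time, hence TERMINATE on maintenance.
gcloud compute instances create dl-spot-box \
    --zone=us-central1-a \
    --machine-type=n1-standard-4 \
    --accelerator=type=nvidia-tesla-t4,count=1 \
    --provisioning-model=SPOT \
    --maintenance-policy=TERMINATE \
    --image-family=pytorch-latest-gpu \
    --image-project=deeplearning-platform-release
```

Checkpoint training state regularly, since a Spot VM can be preempted mid-run.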

1

immo_92_ t1_jaqyv5z wrote

You can use Kaggle as well for training your DL models. Other platforms you need to pay for.

1