Submitted by Cyp9715 t3_11jfqpl in deeplearning

Hello, I'm considering a graphics card for deep learning.

Because of the restrictions on using ROCm with the RX 570 I'm currently using, I want to either get a new NVIDIA graphics card or use Colab. In Korea the used price of an RTX 3070 is under $300, so cost isn't a big obstacle.

Colab is good, but having used it, I find it has some annoying (though not fatal) disadvantages compared to local development. What would you choose?

thank you :)

3

Comments

karyo t1_jb2qwti wrote

Unless everything you want to try out fits in the 3070's memory (8 GB), I'd recommend Colab.
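If you're not sure whether your models fit, a rough back-of-the-envelope check helps. A minimal PyTorch sketch (the ResNet-50 and the FP32/Adam assumptions are only illustrative, not anyone's actual workload): count parameters and remember that training state alone is roughly 4x the weight memory, before activations are even counted.

```python
import torchvision

# Illustrative model only; substitute whatever you actually plan to train.
model = torchvision.models.resnet50()

params = sum(p.numel() for p in model.parameters())
bytes_per_param = 4  # FP32

# Weights + gradients + Adam's two moment buffers ~= 4x the weight memory;
# activations come on top and usually dominate at larger batch sizes.
train_state_gb = params * bytes_per_param * 4 / 1024**3
print(f"{params / 1e6:.1f}M params, ~{train_state_gb:.2f} GB of training state before activations")
```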

1

No_Dust_9578 t1_jb2r3mw wrote

Colab has become expensive since they introduced compute units. I was using Pro+ and running code for 2-3 hours a day. They give you 500 units per month, and a premium GPU costs 13-15 units per hour, so at that usage the monthly allowance runs out well before the month does. I'd rather get a 3070 and run all I want.

1

I_will_delete_myself t1_jb2zavm wrote

I suggest using Colab free. The resources are more than most people need; use the cloud when you have a serious workload like a business or research project.

If you want to do gaming with it as well, then try the RTX 3060 instead. Ironically, its extra VRAM lets you do more than the RTX 3070.

Either path will eventually lead you to the cloud if you want to be remotely competitive on serious workloads.

6

I_will_delete_myself t1_jb3gzlz wrote

OP's use case, though, is just looking for a cheap GPU to dabble with. If you have the money for the 3090 then go ahead. However, the cloud and Colab are a lot cheaper at the moment, until Google decides to screw everyone over in the future.

1

No_Difference9752 t1_jb3z8xg wrote

Get a power supply and a 3090. Cheaper than keeping up with cloud costs. I'm building one now.

1

tsgiannis t1_jb40tzg wrote

A 3070 should be much, much faster than Colab, and you have the added bonus of full debugging capabilities (PyCharm/Spyder, etc.).

Even my second-hand 3050 is much faster than Colab... but it's always helpful to have a second machine... so 3070 AND Colab.

1

anonynousasdfg t1_jb4athp wrote

Actually, RunPod looks like a better alternative to Colab for cloud GPU rentals.

1

I_will_delete_myself t1_jb524n7 wrote

Develop on your PC first, then just use Colab for the training job. If you have to run longer than the timeout, then just buy a cloud instance. I have been using Colab for years and never had those issues. Use it for what it's meant for and you won't run into problems.
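In practice that workflow just means keeping the script device-agnostic, so the same code runs on a local GPU or a Colab/cloud instance without edits. A minimal PyTorch sketch (the toy model and tensor shapes are placeholders):

```python
import torch
import torch.nn as nn

# Use whatever accelerator is present: local 3070, a Colab/cloud GPU, or plain CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(128, 10).to(device)           # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 128, device=device)         # dummy batch
y = torch.randint(0, 10, (64,), device=device)

# One training step; the same script works locally and in the cloud.
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(f"ran one step on {device}, loss={loss.item():.3f}")
```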

1

bartzer t1_jb54163 wrote

I suggest getting the 3070 (or similar) for prototyping and testing your ideas. You can reduce VRAM usage by scaling down your data, training with a smaller batch size, etc., to see if your concept makes sense (see the sketch below).

At some point you may run into VRAM or other hardware limitations (you can't train with larger images, for example). If that happens, you can run training on Colab or some other high-performance hardware offering.
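For example, both knobs, a smaller batch size and downscaled inputs, are a couple of lines in a typical PyTorch data pipeline. A hedged sketch, with the dataset and sizes purely illustrative:

```python
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Downscale inputs: 128x128 instead of full resolution to cut activation memory.
transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])

# FakeData stands in for a real dataset here; swap in your own.
dataset = datasets.FakeData(size=512, image_size=(3, 512, 512), transform=transform)

# Smaller batch size: VRAM usage scales roughly linearly with it.
loader = DataLoader(dataset, batch_size=8, shuffle=True)

images, labels = next(iter(loader))
print(images.shape)  # torch.Size([8, 3, 128, 128])
```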

1

Final-Rush759 t1_jb5aptb wrote

Buy a 3060 12 GB. The 3070's 8 GB of VRAM has more limitations. Colab is largely not free now; it's fine if you're willing to pay for the service. You can also use vast.ai and Lambda Labs for cloud GPUs.

1

incrediblediy t1_jb5dzqa wrote

This is with each card running individually on a full x16 PCIe 4.0 slot, and the roughly 3x speedup is about what you'd expect from the TFLOPS numbers below (35.58 / 12.74 ≈ 2.8x). I.e. I compared times when I had only the 3060 versus the 3090 in the same slot, running the model on a single GPU each time.

I don't do much training on the 3060 now; it's mostly just connected to the monitors, etc.

I have changed the batch sizes to suit the 24 GB anyway, as I'm working with CV data. It could be a bit different with other types of models.

3060 = FP32 (float) 12.74 TFLOPS (https://www.techpowerup.com/gpu-specs/geforce-rtx-3060.c3682)
3090 = FP32 (float) 35.58 TFLOPS (https://www.techpowerup.com/gpu-specs/geforce-rtx-3090.c3622)

I must say the 3060 is a wonderful card and helped me a lot until I found this ex-mining 3090. Really worth it for the price with 12 GB of VRAM.

1