Submitted by Cyp9715 t3_11jfqpl in deeplearning

Hello! I am considering a graphics card for deep learning.

Because of the restrictions on using ROCm with the RX 570 I currently use, I want to either get a new NVIDIA graphics card or use Colab. In Korea, the used price of the RTX 3070 is under $300, so cost is not a major obstacle.

Colab is good, but having used it, I find it has some annoying (though not fatal) disadvantages compared to local development. What would you choose?


Thank you :)

3

Comments


I_will_delete_myself t1_jb2zavm wrote

I suggest using the free tier of Colab. Its resources are more than most people need; use the cloud when you have a serious workload, like a business or research.

If you also want to game on it, then try the RTX 3060 instead. Ironically, its larger VRAM lets you do more than the RTX 3070.

Either path will eventually lead you to the cloud if you want to be remotely competitive on serious workloads.

6

xRaptorGG t1_jb3tjgt wrote

Whenever I try to connect to a GPU on Colab, I get a GPU limit message. This has been the case for the last three weeks.

2

I_will_delete_myself t1_jb524n7 wrote

Develop on your PC first, then use Colab just for the training job. If you have to run longer than the timeout, then buy a cloud instance. I have been using Colab for years and never hit those issues. Use it for what it's meant for and you won't run into problems.
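The develop-locally, train-remotely workflow above works best when the script picks its device at runtime, so the same code runs unchanged on a local CPU/GPU or a Colab GPU. A minimal sketch, assuming PyTorch (the linear model and tensor shapes are toy stand-ins, not anything from the thread):

```python
import torch

def get_device() -> torch.device:
    """Pick the best available device so one script runs locally or on Colab."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = get_device()
model = torch.nn.Linear(16, 4).to(device)   # stand-in for a real model
x = torch.randn(8, 16, device=device)       # stand-in for a real batch
print(model(x).shape)                       # torch.Size([8, 4])
```

With this pattern, nothing in the training code needs to change when you move it from your PC to a Colab notebook.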

1

xRaptorGG t1_jb55a1a wrote

I will be buying a 4070 Ti this summer and wanted to use Colab until then, but I've had no luck getting a GPU.

1

I_will_delete_myself t1_jb55g8o wrote

You might get a better deal with an RTX 3090: double the VRAM for around the same price. It's way too easy to hit the VRAM limit on a GPU.

2

xRaptorGG t1_jb5alm6 wrote

In my country the 4070 Ti is around $1,000 and the 3090 is $1,600–1,700. I can't afford a GPU that expensive.

1

I_will_delete_myself t1_jb5do6p wrote

Go to the used market. Brand-new cards are more expensive because of scalpers; second-hand prices are much more reasonable.

1

xRaptorGG t1_jb5wh4g wrote

I'm worried about getting a mined card that might die the next day.

1

Cyp9715 OP t1_jb30vxg wrote

Thanks to you, I found out that the RTX 3060 has 12 GB of VRAM. I'll consider it.

1

karyo t1_jb2qwti wrote

Unless everything you want to try fits in the 3070's memory (8 GB), I'd recommend Colab.

1

Cyp9715 OP t1_jb2x4ak wrote

Thank you for your kind comments.

1

No_Dust_9578 t1_jb2r3mw wrote

Colab became expensive once they introduced compute units. I was using Pro+ and running code 2–3 hours a day. They give you 500 units per month, and a premium GPU costs 13–15 units per hour. I'd rather get a 3070 and run all I want.

1

Cyp9715 OP t1_jb2xcry wrote

Thank you for your kind comments. I'm seriously considering buying the RTX 3070, and I'll probably buy it within the week.

1

incrediblediy t1_jb2tdir wrote

Can you find a used RTX 3090?

1

Cyp9715 OP t1_jb2wxsd wrote

I can find one, but installing an RTX 3090 also requires replacing the power supply, so I'm trying to compromise with the 3070.

1

I_will_delete_myself t1_jb2z5ju wrote

The 3060 is better. The VRAM lets you get more done.

1

incrediblediy t1_jb3g1qw wrote

I have a dual 3090 + 3060 setup running on an 850 W PSU. The 3090 is about 4x the speed of the 3060.

1

I_will_delete_myself t1_jb3gzlz wrote

OP's use case is just looking for a cheap GPU to dabble with. If you have the money for a 3090, go ahead. However, the cloud and Colab are a lot cheaper at the moment, until Google decides to screw everyone over in the future.

1

Final-Rush759 t1_jb5bxf5 wrote

That's only 2x the 3060 for me. Maybe you are power-limited or CPU-bottlenecked when using both GPUs, or limited by PCIe bandwidth.

1

incrediblediy t1_jb5dzqa wrote

That was when each card was running individually on a full x16 PCIe 4.0 slot, and it's about what you'd expect from the TFLOPS ratio (~3x) as well. (I compared times from when I had only the 3060 vs. the 3090 in the same slot, running the model on a single GPU each time.)

I don't do much training on the 3060 now; it's mostly just driving the monitors.

I've adjusted my batch sizes to suit the 24 GB anyway, since I'm working with CV data. It could be a bit different with other types of models.

3060 = FP32 (float) 12.74 TFLOPS (https://www.techpowerup.com/gpu-specs/geforce-rtx-3060.c3682)
3090 = FP32 (float) 35.58 TFLOPS (https://www.techpowerup.com/gpu-specs/geforce-rtx-3090.c3622)
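As a back-of-the-envelope check, the spec-sheet numbers quoted above put the theoretical FP32 gap at roughly 2.8x, consistent with the ~3x observed:

```python
# Peak FP32 throughput from the TechPowerUp pages linked above
tflops_3060 = 12.74
tflops_3090 = 35.58

ratio = tflops_3090 / tflops_3060
print(f"{ratio:.2f}x")  # 2.79x
```

Real training speedups depend on memory bandwidth and the workload, so the observed ratio will not match the TFLOPS ratio exactly.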

I must say the 3060 is a wonderful card and it helped me a lot until I found this ex-mining 3090. Really worth the price with its 12 GB of VRAM.

1

Final-Rush759 t1_jb5f7eu wrote

I used mixed-precision training, so it should have been largely FP16. You can still pass inputs as float32; PyTorch AMP will autocast them to FP16. I only get about a 2x speedup with the 3090.
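For reference, the AMP pattern described here is autocast around the forward pass plus a gradient scaler for the backward pass. A minimal sketch, assuming PyTorch (the linear model and random data are toy stand-ins; on CPU both pieces degrade to no-ops, so the same code runs anywhere):

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
model = torch.nn.Linear(32, 1).to(device)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
# GradScaler guards FP16 gradients against underflow; only needed on CUDA.
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

x = torch.randn(64, 32, device=device)  # float32 inputs are fine...
y = torch.randn(64, 1, device=device)

with torch.autocast(device_type=device, enabled=(device == "cuda")):
    # ...autocast runs eligible ops in FP16 inside this block
    loss = torch.nn.functional.mse_loss(model(x), y)

scaler.scale(loss).backward()  # scale loss, backprop in mixed precision
scaler.step(opt)               # unscale grads, skip step if inf/nan found
scaler.update()                # adjust the scale factor for the next step
```

On Ampere cards like the 3090, much of the FP16 speedup comes from Tensor Cores, so the gain depends heavily on how matmul-heavy the model is.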

1

incrediblediy t1_jb3fxyt wrote

Power supplies are quite cheap now, probably around $125 for 850 W.

1

No_Difference9752 t1_jb3z8xg wrote

Get a power supply and a 3090. It's cheaper than trying to keep up. I'm building one now.

1

tsgiannis t1_jb40tzg wrote

A 3070 should be much, much faster than Colab, and you get the added bonus of full debugging capabilities (PyCharm/Spyder, etc.).

Even my second-hand 3050 is much faster than Colab... but it is always helpful to have a second machine, so: 3070 AND Colab.

1

anonynousasdfg t1_jb4athp wrote

Actually, RunPod looks like a better alternative to Colab for cloud GPU rentals.

1

bartzer t1_jb54163 wrote

I suggest getting the 3070 (or similar) for prototyping/testing your ideas. You can reduce VRAM usage by scaling down your data or training with a smaller batch size, etc., to see if your concept makes sense.

At some point you may run into VRAM or other hardware limitations (you can't train with larger images, for example). If that happens, you can run training on Colab or some other high-performance hardware offering.
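The batch-size knob mentioned above can be sanity-checked on paper before touching the GPU. A rough sketch (pure arithmetic; the 224x224 RGB shapes are illustrative assumptions, and this counts only the input tensor, so treat it as a lower bound on actual VRAM use):

```python
def batch_mem_mb(batch_size: int, channels: int, height: int, width: int,
                 bytes_per_elem: int = 4) -> float:
    """Size of one input batch in MB (float32 by default).

    Activations, weights, gradients, and optimizer state usually dominate,
    so real VRAM usage is a multiple of this number.
    """
    return batch_size * channels * height * width * bytes_per_elem / 2**20

# Halving the batch size halves this term, which is the usual first knob
# to turn when an 8 GB card runs out of memory.
print(batch_mem_mb(64, 3, 224, 224))  # 36.75
print(batch_mem_mb(32, 3, 224, 224))  # 18.375
```

If even the smallest workable batch size overflows 8 GB, that's the signal to move the run to Colab or a rented GPU, as suggested above.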

1

Final-Rush759 t1_jb5aptb wrote

Buy the 3060 12 GB. The 3070's 8 GB of VRAM is more limiting. Colab is largely not free now, which is fine if you are willing to pay for the service. You can also use vast.ai or Lambda Labs for cloud GPUs.

1