Submitted by AutoModerator t3_yntyhz in MachineLearning
vgaggia t1_ivlbry7 wrote
Reply to comment by SolidMoses in [D] Simple Questions Thread by AutoModerator
A 3070 only has 8 GB of VRAM, which isn't much for a lot of machine learning applications. If your budget can't stretch to something like a 3090, your best bet could actually be a 3060 (not the Ti! It has less VRAM). It will be slower, but you'll be able to do more with its 12 GB of VRAM. Honestly though, you might want to wait for the 4060. If anyone else wants to fix any of my errors, go ahead, but I don't think you can do much more than what I said, other than maybe using Google Colab/RunPod or some other GPU cloud service.
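To put rough numbers on why the VRAM matters: a back-of-envelope sketch for training memory (fp32 weights + gradients + the two Adam optimizer moments; activations are ignored, so real usage is higher). The model sizes below are illustrative assumptions, not from the thread:

```python
# Rough VRAM estimate for *training* a model in fp32 with Adam.
# Counts weights + gradients + 2 optimizer moments = 4 copies of the
# parameters. Activation memory is NOT included, so real usage is higher.
def training_vram_gb(num_params: int, bytes_per_value: int = 4) -> float:
    copies = 4  # weights, grads, Adam m, Adam v
    return num_params * bytes_per_value * copies / 1024**3

# Hypothetical model sizes, just to show where 8 GB vs 12 GB bites:
for name, n in [("350M params", 350_000_000), ("1.3B params", 1_300_000_000)]:
    print(f"{name}: ~{training_vram_gb(n):.1f} GB before activations")
```

Even a ~350M-parameter model lands around 5 GB before activations, which is already tight on an 8 GB 3070 once activations and CUDA overhead are added; a 1.3B model blows past 12 GB entirely, which is where cloud GPUs come in.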
EDIT:
If you're comfortable with second-hand hardware and can find one for a good price, you could also potentially buy a P100 or a T4 off something like eBay, although I can't really recommend this.