vgaggia t1_ivlb9ee wrote
Reply to [D] Simple Questions Thread by AutoModerator
Where do you see machine learning being in the next 2 years? Also, does it just take trial and error for you guys to get such consistently better results from version updates, e.g. v1.4 → v1.5?
vgaggia t1_ivlbry7 wrote
Reply to comment by SolidMoses in [D] Simple Questions Thread by AutoModerator
A 3070 only has 8 GB of VRAM, which won't be enough for a lot of machine learning applications. If your budget can't stretch to something like a 3090, your best bet could actually be a 3060 (not the Ti! It has less VRAM). It will be slower, but you'll be able to do more with its 12 GB of VRAM. You might want to wait for the 4060, to be honest. If anyone else wants to fix any of my errors, go ahead, but I don't think you can do much more than what I said, other than maybe using Google Colab/RunPod or some other GPU cloud service.
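To get a feel for why 8 GB fills up fast, here's a rough back-of-the-envelope estimate of training memory (a minimal sketch; the 1.2× activation/buffer overhead factor and the example parameter count are assumptions for illustration, not measured values):

```python
def training_vram_gb(n_params, bytes_per_param=4, optimizer_states=2, overhead=1.2):
    """Rough estimate of VRAM needed to train a model, in GB.

    Counts one copy each of weights and gradients, plus extra
    per-parameter optimizer state (Adam keeps two: momentum and
    variance), then multiplies by an assumed overhead factor for
    activations and framework buffers.
    """
    tensors = 1 + 1 + optimizer_states  # weights + grads + optimizer state
    return n_params * bytes_per_param * tensors * overhead / 1e9

# A ~900M-parameter model trained in fp32 with Adam:
print(f"{training_vram_gb(900e6):.1f} GB")  # → 17.3 GB, well past 8 GB
```

Even this crude estimate shows a sub-billion-parameter model blowing past an 8 GB card, which is why the extra 4 GB on a 3060 matters more than raw speed for some workloads.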
EDIT:
If you're comfortable with second-hand and you can find one at a good price, you could also potentially buy a P100 or a T4 off somewhere like eBay, although I can't really recommend this.