Submitted by gokul113 t3_xsjrws in deeplearning
Karyo_Ten t1_iqnb0i2 wrote
Neither.
Mac M1 for deep learning? No NVIDIA GPU means no practical deep learning. And before people pull out pitchforks about PyTorch and TensorFlow supporting M1: it's a pain, many ecosystem packages only support CUDA, and recompiling everything is a time sink.
The RTX 2060 is a bit of a bummer when the 3060 12GB is a clean upgrade for not much more; 6GB of VRAM is getting small these days. And you didn't mention the RAM: 16GB is the minimum just to have your browser, VS Code, Discord/Slack or whatever you use to communicate, plus your model.
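To put the 6GB concern in numbers, here's a rough sketch of the common rule of thumb for training-time VRAM: weights + gradients + optimizer states, all in fp32, with Adam keeping two extra fp32 moments per parameter (activations come on top of this). The function name and the example model size are illustrative, not from the original comment.

```python
def training_vram_gb(n_params, bytes_per_param=4, optimizer_states=2):
    """Rough lower bound on training VRAM in GiB (weights + grads + optimizer).

    Assumes fp32 tensors and an Adam-style optimizer with two moment
    buffers per parameter. Activation memory is NOT included, and in
    practice it often dominates for large batch sizes.
    """
    tensors_per_param = 1 + 1 + optimizer_states  # weights + grads + moments
    return n_params * bytes_per_param * tensors_per_param / 1024**3

# A hypothetical 100M-parameter model, before activations:
print(round(training_vram_gb(100e6), 2))  # ~1.49 GiB just for parameter state
```

Once activations and CUDA context overhead are added, even mid-sized models eat into 6GB quickly, which is why the 12GB 3060 is the safer buy.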