Submitted by democracyab t3_z9isty in deeplearning
ApplicationBoth1829 t1_iyi2jon wrote
A 2060 with 12 GB of VRAM is definitely better.
The Tesla P40 is also a good choice; it's slower, but you get 24 GB of VRAM for less than $150 (modification required).
C0demunkee t1_iyipj52 wrote
I've got an M40 and love it
mr_birrd t1_iyjlsb4 wrote
A P40 for less than $150???
ApplicationBoth1829 t1_iylhuux wrote
As far as I know, yes, at least in China. You can get a P40 for about $130.
Don't worry, it's not fake; these cards just come from old servers retired by companies.
mr_birrd t1_iyln0js wrote
Just realised there's no usable mixed precision on a P40, so there you go..
ApplicationBoth1829 t1_iylqr22 wrote
fact
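The mixed-precision point above comes down to compute capability: Tensor Cores (which make FP16 mixed precision fast) arrived with Volta (compute capability 7.0), while the Pascal-era P40 is stuck at 6.1 with very slow FP16 throughput. A minimal illustrative sketch of that rule, using a small hand-written lookup table (the helper function and table are assumptions for illustration, not a real library API; capability values are from NVIDIA's published specs):

```python
# Illustrative sketch: which of the cards mentioned in this thread
# benefit from FP16 mixed precision. The lookup table and helper are
# hypothetical; compute capabilities are NVIDIA's published values.

# card name -> (compute capability major, minor)
GPU_SPECS = {
    "Tesla M40":   (5, 2),  # Maxwell GM200
    "Tesla P40":   (6, 1),  # Pascal GP102
    "RTX 2060":    (7, 5),  # Turing TU106
    "RTX 2080 Ti": (7, 5),  # Turing TU102
}

def mixed_precision_useful(name: str) -> bool:
    """Tensor Cores arrived with compute capability 7.0 (Volta);
    Maxwell/Pascal cards run FP16 slowly (or not at all), so
    autocast-style mixed precision gives no speedup there."""
    return GPU_SPECS[name] >= (7, 0)

for gpu in GPU_SPECS:
    print(f"{gpu}: mixed precision useful -> {mixed_precision_useful(gpu)}")
```

So the P40's 24 GB still helps for fitting large models, but you pay full FP32 compute for it, whereas a 2060 can roughly double throughput with mixed precision.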
ApplicationBoth1829 t1_iylijsp wrote
In China, electronic components are much cheaper, so crazy things happen all the time.
I just modified my 2080 Ti to 22 GB and it works just fine.