
wally1002 t1_jcx6232 wrote

For deep learning, higher VRAM is always preferable. 12/16 GB limits the kinds of models you can run or do inference with. With LLMs getting democratised, it's better to be future-proof.
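For rough intuition on why 12/16 GB gets tight, here's a minimal sketch of the usual back-of-the-envelope VRAM estimate for inference (fp16 weights only); the 1.2x overhead factor for activations/KV cache is just an illustrative assumption, not a precise figure:

```python
# Back-of-the-envelope VRAM estimate for LLM inference.
# Assumes fp16/bf16 weights (2 bytes per parameter); the 1.2x overhead
# for activations, KV cache, and CUDA context is a rough assumption.
def estimated_vram_gb(num_params_billions: float, bytes_per_param: int = 2) -> float:
    weights_gb = num_params_billions * 1e9 * bytes_per_param / 1024**3
    return weights_gb * 1.2

for b in (7, 13, 30):
    print(f"{b}B params @ fp16: ~{estimated_vram_gb(b):.0f} GB")
# 7B  -> ~16 GB (barely fits a 16 GB card)
# 13B -> ~29 GB
# 30B -> ~67 GB
```

Quantisation (8-bit/4-bit) changes these numbers, but the weights-times-bytes-per-parameter rule of thumb is why 12/16 GB cards feel limiting for anything beyond the smallest models.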

12

mrcet007 t1_jcxgyl7 wrote

12/16 GB is already hitting the limit of what's available on the market for consumer gaming GPUs. The only consumer GPU for deep learning with more than 16 GB is the 4090, which at $1,500 is already out of reach for most individuals.

−4