TruthAndDiscipline t1_j99vabp wrote
Reply to [D] What matters while running models? by DevarshTare
VRAM capacity has no effect on speed, but if you don't have enough to hold the model and data, you can't train at all (you'll hit a CUDA out-of-memory error).
For speed, just look at published benchmark charts.
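A rough way to sanity-check whether a model will fit before you hit that OOM error is to estimate the memory its weights need. This is a minimal sketch (the function name and the 1.3B parameter count are made up for illustration); it only counts the weights themselves, while training also needs activations, gradients, and optimizer state, which can easily multiply the total several times over.

```python
def estimate_weight_vram_gb(num_params: int, bytes_per_param: int = 4) -> float:
    """Rough VRAM needed just to store model weights.

    bytes_per_param: 4 for fp32, 2 for fp16/bf16.
    """
    return num_params * bytes_per_param / 1024**3

# Hypothetical 1.3B-parameter model:
fp32_gb = estimate_weight_vram_gb(1_300_000_000)           # ≈ 4.84 GB
fp16_gb = estimate_weight_vram_gb(1_300_000_000, bytes_per_param=2)  # ≈ 2.42 GB
print(round(fp32_gb, 2), round(fp16_gb, 2))
```

If the weights alone are already close to your card's VRAM, training locally won't work regardless of how fast the GPU is.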
TruthAndDiscipline t1_j9avex4 wrote
Reply to comment by DevarshTare in [D] What matters while running models? by DevarshTare
I'm using a 3060 (non-Ti) with 12 GB of VRAM and train locally as well. Performance is fine, too.