Submitted by head_robotics t3_1172jrs in MachineLearning
xrailgun t1_j9avboh wrote
Reply to comment by wywywywy in [D] Large Language Models feasible to run on 32GB RAM / 8 GB VRAM / 24GB VRAM by head_robotics
Thanks!
I wish model publishers would indicate rough (V)RAM requirements...
wywywywy t1_j9b2kqu wrote
So, not scientific at all, but I've noticed that checkpoint file size × 0.6 is pretty close to the actual VRAM requirement for LLMs.
But you're right, it'd be nice to have a table handy.
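As a rough sketch of that rule of thumb in Python (not exact, and the checkpoint path below is just a placeholder for illustration):

```python
import os

def estimate_vram_gb(checkpoint_path: str, factor: float = 0.6) -> float:
    """Rough VRAM estimate in GB: checkpoint file size * ~0.6 (heuristic, not scientific)."""
    size_bytes = os.path.getsize(checkpoint_path)
    return size_bytes * factor / (1024 ** 3)

# Example usage (hypothetical file name):
# print(f"~{estimate_vram_gb('pytorch_model.bin'):.1f} GB VRAM (rough estimate)")
```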