Submitted by head_robotics t3_1172jrs in MachineLearning
wywywywy t1_j9ar2tk wrote
Reply to comment by xrailgun in [D] Large Language Models feasible to run on 32GB RAM / 8 GB VRAM / 24GB VRAM by head_robotics
I did test larger models, but they didn't run. I can't remember which ones; probably GPT-J. I recently got a 3090, so I can load larger models now.
As for quality: my use case is simple (writing prompts to help with stories & articles), nothing sophisticated, and they worked well. Until ChatGPT came along. I use ChatGPT instead now.
xrailgun t1_j9avboh wrote
Thanks!
I wish model publishers would indicate rough (V)RAM requirements...
wywywywy t1_j9b2kqu wrote
So, not scientific at all, but I've noticed that checkpoint file size * 0.6 comes pretty close to the actual VRAM requirement for an LLM.
But you're right, it'd be nice to have a table handy.
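That rule of thumb is easy to turn into a quick estimator. Below is a minimal sketch; the function name `estimate_vram_gb` and the 0.6 factor as a parameter are my own choices, and the heuristic itself is the commenter's unscientific observation, not a guarantee (actual usage varies with dtype, framework overhead, and context length).

```python
import os

def estimate_vram_gb(checkpoint_path=None, checkpoint_bytes=None, factor=0.6):
    """Rough VRAM estimate for loading an LLM checkpoint.

    Applies the rule of thumb from the comment above:
    VRAM needed ~= checkpoint file size * 0.6.
    This is a heuristic only; real usage depends on dtype,
    framework overhead, and context length.
    """
    if checkpoint_bytes is None:
        # Fall back to reading the file size from disk
        checkpoint_bytes = os.path.getsize(checkpoint_path)
    return checkpoint_bytes * factor / (1024 ** 3)

# Example: a ~24 GB checkpoint would need roughly 14.4 GB of VRAM
print(round(estimate_vram_gb(checkpoint_bytes=24 * 1024 ** 3), 1))  # -> 14.4
```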