[D] Large Language Models feasible to run on 32GB RAM / 8GB VRAM / 24GB VRAM
Submitted by head_robotics (t3_1172jrs) on February 20, 2023 at 9:33 AM in MachineLearning · 51 comments · 220 points
gpt-doktor-6b t1_j9b3u79 wrote on February 20, 2023 at 5:05 PM You might be interested in this tutorial on loading large models. It promises that you can run inference on a model as long as you have enough disk space. https://huggingface.co/blog/accelerate-large-models Permalink 21
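For context, a minimal sketch of the big-model-inference flow described in that Accelerate blog post: loading a checkpoint with `device_map="auto"` so layers that don't fit on the GPU are placed on CPU RAM and, failing that, offloaded to disk. The model name and offload folder below are illustrative, not from the original comment, and this assumes `transformers` and `accelerate` are installed.

```python
# Sketch of Accelerate's big-model inference with CPU/disk offload.
# Model name and offload folder are example values, not from the thread.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"  # example 6B model; substitute your own

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",          # let Accelerate split layers across GPU / CPU RAM / disk
    offload_folder="offload",   # spill weights that don't fit in memory to this folder
    torch_dtype=torch.float16,  # half precision roughly halves memory vs. float32
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The trade-off, as the blog post notes, is speed: layers offloaded to disk are streamed in on demand, so generation can be very slow, but it lets a model larger than VRAM (or even RAM) run at all.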