Submitted by head_robotics t3_1172jrs in MachineLearning
xrailgun t1_j9dtp9c wrote
Reply to comment by Emergency_Apricot_77 in [D] Large Language Models feasible to run on 32GB RAM / 8 GB VRAM / 24GB VRAM by head_robotics
It's not unreasonable to think OP primarily wants the functionality of current LLMs, and if something can provide that more efficiently (or shows promise of doing so in the near future), they may want to know about it too.