[D] llama 7b vs 65b ? Submitted by deck4242 on March 29, 2023 at 2:42 PM in MachineLearning
ortegaalfredo wrote on March 31, 2023 at 9:42 PM, replying to machineko: 2x3090. The 65B is using int4; the 30B is using int8 (required for LoRA).
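The quantization choices in the comment above follow from a simple VRAM budget. A back-of-the-envelope sketch (my own, not from the commenter; it counts only weight storage and ignores activations, KV cache, and LoRA adapter overhead) shows why 65B needs int4 but 30B can stay at int8 on two 24 GB RTX 3090s:

```python
def weight_vram_gb(n_params_billion, bits):
    """Approximate GB needed just to hold the weights:
    params (in billions) * bits per param / 8 bits per byte."""
    return n_params_billion * bits / 8

TOTAL_VRAM_GB = 2 * 24  # two RTX 3090s, 24 GB each

for name, params, bits in [("65B int8", 65, 8),
                           ("65B int4", 65, 4),
                           ("30B int8", 30, 8)]:
    need = weight_vram_gb(params, bits)
    fits = "fits" if need < TOTAL_VRAM_GB else "does not fit"
    print(f"{name}: ~{need:.1f} GB of weights -> {fits} in {TOTAL_VRAM_GB} GB")
```

65B at int8 would need roughly 65 GB for weights alone, beyond the 48 GB available, while 65B at int4 (~32.5 GB) and 30B at int8 (~30 GB) both leave headroom, which matters when LoRA fine-tuning adds its own memory cost on top.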