machineko wrote
Reply to comment by Evening_Ad6637 in [D] Training a 65b LLaMA model by Business-Lead2679
16GB of RAM is not enough for even the smallest LLaMA model (7B). You can try doing LoRA with int8, as listed above. Did you try the Python script I linked above?
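
For reference, here is a minimal sketch of what an int8 LoRA setup looks like with Hugging Face transformers, peft, and bitsandbytes. The checkpoint name and LoRA hyperparameters are illustrative assumptions, not taken from the linked script:

```python
# Sketch of int8 LoRA fine-tuning setup (transformers + peft + bitsandbytes).
# Model path and hyperparameters are placeholders, not from the linked script.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, prepare_model_for_int8_training

model_name = "decapoda-research/llama-7b-hf"  # hypothetical checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load the base weights in int8 so the 7B model fits in roughly half the
# GPU memory that fp16 (~14GB) would require.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,
    device_map="auto",
)

# Cast layer norms to fp32 and enable gradient checkpointing, as needed
# for stable training on top of int8 weights.
model = prepare_model_for_int8_training(model)

# Attach small trainable LoRA adapters to the attention projections;
# the frozen int8 base weights are never updated.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all params
```

Note that newer peft releases rename the preparation helper to `prepare_model_for_kbit_training`; the version shown here matches the API available at the time of this thread.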