gliptic t1_jd2bsc7 wrote
Reply to comment by lurkinginboston in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
In fact, GPT-3 is 175B parameters. But GPT-3 is an older model and doesn't make effective use of those parameters: by the Chinchilla scaling results, it was trained on far fewer tokens than is compute-optimal for its size, which is why much smaller, more recent models like LLaMA can be competitive with it.
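As a rough back-of-the-envelope illustration, here's a sketch using the Chinchilla rule of thumb of ~20 training tokens per parameter (the exact ratio and the token counts below are approximations taken from the GPT-3 and LLaMA papers, not from this thread):

```python
# Chinchilla (Hoffmann et al., 2022) suggests roughly 20 training tokens
# per parameter for compute-optimal training. GPT-3 was trained on ~300B
# tokens; LLaMA 30B on ~1.4T tokens.

TOKENS_PER_PARAM = 20  # rule-of-thumb ratio (assumption)

models = {
    # name: (parameter count, training tokens actually used)
    "GPT-3 175B": (175e9, 300e9),
    "LLaMA 30B": (30e9, 1.4e12),
}

for name, (params, tokens) in models.items():
    optimal = params * TOKENS_PER_PARAM
    print(f"{name}: trained on {tokens / 1e9:.0f}B tokens, "
          f"Chinchilla-optimal ~{optimal / 1e12:.1f}T "
          f"({tokens / optimal:.0%} of optimal)")
```

By this estimate GPT-3 saw under 10% of its compute-optimal token budget, while LLaMA 30B was trained well past its ratio, which is one way to read "doesn't make effective use of those parameters".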