Submitted by imgonnarelph t3_11wqmga in MachineLearning
uspmm2 t1_jd1jh1b wrote
Reply to comment by Straight-Comb-6956 in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
Are you talking about the 30B one?
Straight-Comb-6956 t1_jd1srkd wrote
Haven't tried the 30B model. 65B takes 900ms/token on my machine.
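For context, 900 ms/token works out to roughly 1.1 tokens/s, so a longer reply takes minutes. A quick back-of-the-envelope conversion (plain Python; the 200-token reply length is just a hypothetical example):

```python
# Back-of-the-envelope throughput for the reported 65B latency.
ms_per_token = 900                      # reported: 900 ms/token
tokens_per_sec = 1000 / ms_per_token    # ~1.11 tokens/s
reply_tokens = 200                      # hypothetical reply length
minutes = reply_tokens * ms_per_token / 1000 / 60
print(f"{tokens_per_sec:.2f} tok/s; {minutes:.1f} min for {reply_tokens} tokens")
```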
msgs t1_jd46yf9 wrote
Do you have a link to a torrent/download for the 30B or 65B weights that works with Alpaca.cpp? Reddit DMs are fine if you don't want to post it publicly.
Genesis_Fractiliza t1_jd8w0b9 wrote
May I also have those please?
msgs t1_jd9fayg wrote
So far I haven't found a download. I'll let you know if I do.
msgs t1_jd9jpvl wrote
https://huggingface.co/Pi3141/alpaca-30B-ggml/tree/main
though I haven't tested it yet.
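A minimal sketch of fetching a quantized ggml file from that repo with `huggingface_hub` and pointing alpaca.cpp at it; the filename below is an assumption, so check the repo's file listing for the exact name before running:

```python
# Minimal sketch: download one quantized ggml file from the linked repo,
# then pass the local path to alpaca.cpp's ./chat (or llama.cpp's ./main).
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="Pi3141/alpaca-30B-ggml",
    filename="ggml-model-q4_0.bin",  # assumed filename; verify on the repo page
)
print(model_path)  # e.g. ./chat -m <model_path>
```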