whyvitamins t1_jd0j5zq wrote
Reply to comment by currentscurrents in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
> hope that AMD gets their act together on AI support
honestly, walking around picking up coins from the ground to buy a 3090 would probably be faster
whyvitamins t1_j2sxlc2 wrote
Reply to comment by Taenk in [R] Massive Language Models Can Be Accurately Pruned in One-Shot by starstruckmon
🙏😊
whyvitamins t1_jd1mddg wrote
Reply to comment by currentscurrents in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
realistically, what's the cheapest one can get a used functioning 3090 rn? like 700 usd minimum?