Submitted by imgonnarelph t3_11wqmga in MachineLearning
whyvitamins t1_jd1mddg wrote
Reply to comment by currentscurrents in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
realistically, what's the cheapest you can get a used, functioning 3090 right now? Like $700 minimum?
C0demunkee t1_jd8svm2 wrote
Tesla P40, 24 GB VRAM, ~$150. Two generations behind the 3090 (Pascal vs. Ampere), but hard to beat on VRAM per dollar.
tOSUfever t1_jdfj8k9 wrote
where are you finding 24gb p40's for $150?
C0demunkee t1_jdg030x wrote
eeeeeeebay
Maybe $200 on a bad day, but still far better value than anything newer.
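The reason a 24 GB card matters for the 30B model in this thread is simple arithmetic on weight storage. A rough back-of-envelope sketch (weights only; it ignores KV cache, activations, and framework overhead, so real usage is somewhat higher):

```python
def model_vram_gb(params_billions, bits_per_param):
    """Approximate VRAM (GB) needed to hold model weights alone.

    Ignores KV cache, activations, and runtime overhead, so treat the
    result as a lower bound, not an exact requirement.
    """
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

# A 30B-parameter model at common precisions (illustrative):
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{model_vram_gb(30, bits):.0f} GB")
# 16-bit: ~60 GB  (needs multiple cards)
# 8-bit:  ~30 GB  (still over 24 GB)
# 4-bit:  ~15 GB  (fits on a 24 GB P40 or 3090)
```

This is why 4-bit quantization plus a 24 GB card is the combination people in the thread are pricing out.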