Submitted by imgonnarelph t3_11wqmga in MachineLearning
C0demunkee t1_jd8svm2 wrote
Reply to comment by whyvitamins in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
Tesla P40, 24 GB VRAM, ~$150, only 1 or 2 generations behind the 3090
tOSUfever t1_jdfj8k9 wrote
Where are you finding 24 GB P40s for $150?
C0demunkee t1_jdg030x wrote
eeeeeeebay
Maybe $200 on a bad day, but still far better than anything newer