LetMeGuessYourAlts t1_jbbw4d4 wrote

I just bought a used 3090 for $740 on eBay before tax. I view my GPU as a for-fun expenditure, and part of that fun is ML stuff. For the cost of a handful of new-release video games, you can go from 10 GB to 24 GB of VRAM and do a lot of cool stuff. Less and less state-of-the-art work is going to fit comfortably in 10 GB.
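To make the 10 GB vs. 24 GB point concrete, here's a rough back-of-the-envelope sketch (numbers are illustrative, weights only, ignoring activations/KV cache/overhead):

```python
# Rough VRAM estimate for just holding a model's weights. Real usage is
# higher once you add activations, KV cache, and framework overhead.
def weight_vram_gb(num_params_billions: float, bytes_per_param: float) -> float:
    return num_params_billions * 1e9 * bytes_per_param / 1024**3

for params in (7, 13):
    for label, bpp in (("fp16", 2), ("int8", 1), ("int4", 0.5)):
        gb = weight_vram_gb(params, bpp)
        print(f"{params}B @ {label}: ~{gb:.1f} GB "
              f"(fits 10GB: {gb <= 10}, fits 24GB: {gb <= 24})")
```

A 13B model at fp16 is already past 24 GB just for weights, while a 7B at fp16 (~13 GB) clears 24 GB comfortably but not 10 GB.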

2

LetMeGuessYourAlts t1_j035ugy wrote

And if you carry a similar prompt over to the Playground and run it on text-davinci-003, it will still attempt to answer the question instead of giving up like that, so the refusal is likely produced by something outside the model itself, with the model then completing the error message. I was wondering whether, when confidence is low, it just defaults to an "I'm sorry..." prefix and lets the model produce the rest of the error.
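Something like this sketch of the mechanism I'm guessing at, using the legacy Completions endpoint. The threshold, prefix, and function name are all made up for illustration; this is not how OpenAI actually implements it:

```python
import openai  # legacy (pre-1.0) client; assumes openai.api_key is set

LOW_CONFIDENCE = -1.5  # hypothetical mean-logprob threshold, purely illustrative

def answer_with_fallback(prompt: str) -> str:
    # First pass: let the model answer freely and ask for token logprobs.
    resp = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=150,
        logprobs=1,
    )
    choice = resp["choices"][0]
    logprobs = [lp for lp in choice["logprobs"]["token_logprobs"] if lp is not None]
    mean_logprob = sum(logprobs) / len(logprobs)

    if mean_logprob >= LOW_CONFIDENCE:
        return choice["text"]

    # Low confidence: seed the completion with an apology and let the
    # model write the rest of the "error message" itself.
    fallback = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt + "\nI'm sorry,",
        max_tokens=60,
    )
    return "I'm sorry," + fallback["choices"][0]["text"]
```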

1

LetMeGuessYourAlts t1_iyruft9 wrote

Do you know: are there any Nvidia GPUs at a decent price/performance point that can pool memory? Every avenue I've looked down suggests that nothing a hobbyist can afford offers a large amount of memory without resorting to old workstation GPUs with relatively slow processors. Is the best bet a single 3090 if memory is the priority?
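For what it's worth, even without true memory pooling you can shard a model's layers across two cards and pass activations between them. A minimal PyTorch sketch of that idea, assuming two CUDA devices (the toy model and sizes are made up):

```python
import torch
import torch.nn as nn

# Consumer NVIDIA cards don't expose a single pooled memory space, but a
# framework can still split a model's layers across two GPUs and hand
# activations from one card to the other.
class TwoGpuMLP(nn.Module):
    def __init__(self, dim: int = 4096):
        super().__init__()
        self.first_half = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        ).to("cuda:0")
        self.second_half = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        ).to("cuda:1")

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.first_half(x.to("cuda:0"))
        x = self.second_half(x.to("cuda:1"))  # explicit hop between cards
        return x

if torch.cuda.device_count() >= 2:
    model = TwoGpuMLP()
    out = model(torch.randn(8, 4096))
    print(out.device)  # cuda:1
```

The catch is the inter-GPU transfer at the split point, which is why a single 3090 with 24 GB is usually simpler than two smaller cards.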

1