Submitted by _underlines_ t3_zstequ in MachineLearning
pilibitti t1_j1aimsd wrote
Reply to comment by judasblue in [D] When chatGPT stops being free: Run SOTA LLM in cloud by _underlines_
I think they price by generated token in their other products? If so, there should be a way to make ChatGPT less verbose out of the box.

Also, this will be a lot more popular than their other products, but I assume the hardware capacity isn't really there to serve that demand at the old prices. So it might end up a bit more expensive than their other offerings.
judasblue t1_j1akvas wrote
Oh, I was just pointing out that 1000 tokens in their base model for other services is $0.0004, so an order of magnitude lower than u/coolbreeze770 was guessing. In other words, pretty friggin cheap for most uses, since a rough way to think about it is three tokens equaling two words on average.
edited for clunky wording
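For anyone who wants to plug in their own numbers, here's a quick back-of-envelope sketch using only the figures from the comment above ($0.0004 per 1000 tokens, and roughly 3 tokens per 2 English words) — the rate and ratio are the commenter's, not official numbers:

```python
# Rough cost estimate based on the figures quoted in this thread:
# $0.0004 per 1000 tokens, ~3 tokens per 2 English words on average.

PRICE_PER_1K_TOKENS = 0.0004  # USD, base-model rate quoted above (unofficial)
TOKENS_PER_WORD = 3 / 2       # rough average for English text

def estimate_cost_usd(num_words: int) -> float:
    """Estimate the USD cost of generating num_words words of output."""
    tokens = num_words * TOKENS_PER_WORD
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# A 200-word reply is ~300 tokens, i.e. about $0.00012
print(f"${estimate_cost_usd(200):.5f}")
```

So even a fairly long chat reply costs a small fraction of a cent at that rate, which is why the "single digit cents per chat" figure mentioned below stands out as unusually high.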
f10101 t1_j1cmsob wrote
Just in case you missed my other comment: ChatGPT actually seems to be particularly expensive to run compared to their other APIs. Altman says it costs "single digit cents per chat".