
judasblue t1_j1af1ug wrote

That's high by an order of mag :)


pilibitti t1_j1aimsd wrote

I think they price by generated token in their other products? If so, there should be a way to make ChatGPT less verbose out of the box.

Also, this will be a lot more popular than their other products, but I assume the hardware capacity isn't really there to serve that kind of demand at the older prices. So it might end up a bit more expensive than their other offerings.


judasblue t1_j1akvas wrote

Oh, I was just pointing out that 1000 tokens in their base model for other services is $0.0004, so an order of magnitude lower than u/coolbreeze770 was guessing. In other words, pretty friggin cheap for most uses, since a rough way to think about it is three tokens per two words on average.
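For a rough sense of scale, here's a back-of-envelope sketch of that estimate in Python, assuming the $0.0004 per 1,000 tokens figure and the ~3 tokens per 2 words rule of thumb from above; the constants and function name are just illustrative, not an actual OpenAI pricing calculator.

```python
# Back-of-envelope cost estimate, assuming $0.0004 per 1,000 tokens
# and roughly 3 tokens for every 2 words (both figures from the thread).

PRICE_PER_1K_TOKENS = 0.0004   # USD, base-model price cited above
TOKENS_PER_WORD = 3 / 2        # rule of thumb: 3 tokens ~= 2 words

def estimate_cost(words: int) -> float:
    """Estimate the dollar cost of processing `words` words."""
    tokens = words * TOKENS_PER_WORD
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# Example: a 500-word exchange works out to about $0.0003.
print(f"${estimate_cost(500):.6f}")
```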

edited for clunky wording


f10101 t1_j1cmsob wrote

Just in case you missed my other comment: ChatGPT actually seems to be particularly expensive to run compared with their other APIs. Altman says it costs "single-digit cents per chat".
