coolbreeze770 t1_j1a8mo0 wrote

Or just pay $0.004 per API query? And OpenAI will let you fine-tune their model to your own needs.

Edit: I don't know the precise cost, just pulled that number out of my ass

15

judasblue t1_j1af1ug wrote

That's high by an order of mag :)

5

pilibitti t1_j1aimsd wrote

I think they price by generated token in their other products? If so, there should be a way to make ChatGPT less verbose out of the box.

Also, this stuff will be a lot more popular than their other products, but the hardware capacity isn't really there for that kind of demand at the older prices, I assume. So it might be a bit more expensive than their other offerings.

3
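If pricing does end up being per generated token, capping `max_tokens` on the existing completions endpoint is the usual way to bound what a single reply can cost. A minimal sketch assuming the `openai` Python package and a placeholder prompt; ChatGPT itself has no public API yet, so this uses the older completion models:

```python
import openai  # pip install openai

openai.api_key = "sk-..."  # your API key

# Billing is per token, so capping max_tokens bounds the cost of each reply.
response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Summarize the return policy in two sentences.",
    max_tokens=100,   # hard cap on generated (billed) tokens
    temperature=0.2,
)
print(response["choices"][0]["text"])
```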

judasblue t1_j1akvas wrote

Oh, I was just pointing out that 1,000 tokens in their base model for other services is $0.0004, an order of magnitude lower than u/coolbreeze770 was guessing. In other words, pretty friggin cheap for most uses, since a rough way to think about it is three tokens equaling two words on average.

edited for clunky wording

3
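For a back-of-the-envelope sense of what that rate means, here is a sketch using the $0.0004 per 1,000 tokens figure and the three-tokens-per-two-words rule of thumb quoted above:

```python
PRICE_PER_1K_TOKENS = 0.0004      # base-model rate quoted above, in USD
TOKENS_PER_WORD = 3 / 2           # rough rule of thumb: 3 tokens ~= 2 words

def estimated_cost(words: int) -> float:
    """Approximate cost in USD for a given number of words."""
    tokens = words * TOKENS_PER_WORD
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# A 200-word exchange comes out to roughly a hundredth of a cent.
print(f"${estimated_cost(200):.6f}")   # -> $0.000120
```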

f10101 t1_j1cmsob wrote

Just in case you miss my other comment: ChatGPT actually seems to be particularly expensive to run compared to their other APIs. Altman says "single digit cents per chat".

1

caedin8 t1_j1asist wrote

As soon as we can fine-tune it to our problem space, we are 100% deploying it as a help bot in our commercial software. It's ready; it just needs tuning.

2
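A rough sketch of what that tuning step could look like with OpenAI's existing fine-tuning API, assuming the `openai` Python package and a hypothetical `support_faq.jsonl` file of prompt/completion pairs (ChatGPT itself can't be fine-tuned yet, only the base GPT-3 models):

```python
import openai  # pip install openai

openai.api_key = "sk-..."

# Each line of support_faq.jsonl is a JSON object like:
# {"prompt": "How do I reset my password? ->", "completion": " Go to Settings > Account ..."}
training_file = openai.File.create(
    file=open("support_faq.jsonl", "rb"),
    purpose="fine-tune",
)

# Kick off a fine-tune job against a base GPT-3 model.
job = openai.FineTune.create(
    training_file=training_file["id"],
    model="davinci",
)
print(job["id"])  # poll this job until the tuned model is ready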

IWantAGrapeInMyMouth t1_j1b1vl8 wrote

I imagine there'll be open-source versions of ChatGPT in the near future given its wild popularity. I'll probably just use that for personal projects, and in a business setting I would just run a dedicated instance of that open-source version. 0.004 cents per 1,000 tokens (or much less) is a hell of an ask if you're doing anything where users generate tokens.

2

sanman t1_j1bakvl wrote

Open source is only free when it's running on your own computer. Otherwise, if it's running on someone else's infrastructure, that has to be paid for, typically with ads or something like that.

12

IWantAGrapeInMyMouth t1_j1bzofo wrote

Usually, inference on Hugging Face for large models is free for individuals making a reasonable number of API calls as part of their offerings, and I assume an open-source version of this would be hosted there. I realize that it still costs money.

3
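For reference, a minimal sketch of calling the hosted Inference API, assuming a hypothetical open-source ChatGPT-style checkpoint published on the Hub under `some-org/open-chat-model` and a free-tier account token:

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/some-org/open-chat-model"  # hypothetical model id
HEADERS = {"Authorization": "Bearer hf_..."}  # token from huggingface.co/settings/tokens

def query(prompt: str) -> dict:
    """Send a prompt to the hosted model and return the JSON response."""
    response = requests.post(API_URL, headers=HEADERS, json={"inputs": prompt})
    response.raise_for_status()
    return response.json()

print(query("How do I export my data?"))
```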