hayAbhay t1_j9i9nfv wrote

If you have the hardware, and if you have a lot of those input-output examples, you can fine-tune one of the smaller models in the GPT family as an alternative.

It should work reasonably well, especially if the variance across the input-output pairs isn't too high. (A lot depends on your dataset here.)

There are definitely tradeoffs here in terms of model development, inference, and maintenance. If the expected costs aren't too high, I'd strongly recommend GPT-3 as a base.
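If you go the GPT-3 route, the main prep work is turning your input-output examples into the JSONL prompt/completion format that the legacy GPT-3 fine-tuning API expects. Here's a minimal sketch; the separator and stop strings (`###`, `END`) and the example pairs are just illustrative choices, not requirements:

```python
import json

def to_jsonl(pairs, prompt_suffix="\n\n###\n\n", completion_stop="\nEND"):
    """Convert (input, output) pairs into JSONL records in the
    prompt/completion format used for GPT-3 fine-tuning.

    A fixed suffix marks where the prompt ends, and a stop sequence
    marks where the completion ends, so the model learns clean
    boundaries between the two.
    """
    lines = []
    for inp, out in pairs:
        record = {
            "prompt": inp + prompt_suffix,
            # leading space on the completion tends to tokenize better
            "completion": " " + out + completion_stop,
        }
        lines.append(json.dumps(record))
    return "\n".join(lines)

# hypothetical input-output examples
examples = [
    ("Translate to French: hello", "bonjour"),
    ("Translate to French: goodbye", "au revoir"),
]
print(to_jsonl(examples))
```

You'd write that string to a `.jsonl` file and pass it to the fine-tuning endpoint; the same record shape also works as training data for smaller open models if you go the self-hosted route.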