hayAbhay t1_j9i9nfv wrote
Reply to comment by nail_nail in Fine tuning a GPT for text generation by nashcaps2724
If you have the hardware, and if you have a lot of those input-output examples, you can fine-tune one of the smaller open models in the GPT family instead.
That should work reasonably well, especially if the variance across your input-output pairs isn't too high (a lot depends on your dataset here).
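For the local route, something like the sketch below is roughly what I mean — a minimal Hugging Face fine-tune of GPT-2 on prompt/completion pairs. The file name `pairs.jsonl`, the `###` separator, and the hyperparameters are placeholders, not a recipe:

```python
# Minimal sketch: fine-tuning GPT-2 (a smaller GPT-family model) on
# input -> output pairs with Hugging Face transformers + datasets.
# Paths, separator token, and hyperparameters are illustrative only.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "gpt2"  # swap for gpt2-medium/large if your hardware allows
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Hypothetical JSONL file with one {"input": ..., "output": ...} object per line
dataset = load_dataset("json", data_files={"train": "pairs.jsonl"})

def to_text(example):
    # Concatenate input and output into a single causal-LM training sequence
    return {"text": example["input"] + "\n###\n" + example["output"] + tokenizer.eos_token}

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = (
    dataset["train"]
    .map(to_text)
    .map(tokenize, batched=True, remove_columns=["input", "output", "text"])
)

# mlm=False -> plain causal language modeling, labels copied from input_ids
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    learning_rate=5e-5,
    logging_steps=50,
)

Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator).train()
```

At generation time you'd prompt with `input + "\n###\n"` and let the model complete, stopping at the EOS token.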
There are definitely tradeoffs here in terms of model development, inference, and ongoing maintenance. If the expected usage costs aren't too high, I'd strongly recommend GPT-3 as a base.
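If you go the GPT-3 route, fine-tuning runs through OpenAI's hosted fine-tunes endpoint rather than your own hardware. Roughly like the sketch below (legacy pre-1.0 `openai` Python client; the file name, base model, and key are placeholders):

```python
# Hedged sketch: fine-tuning GPT-3 via OpenAI's legacy fine-tunes endpoint.
# Training file is JSONL with one {"prompt": ..., "completion": ...} per line.
import openai

openai.api_key = "sk-..."  # your API key

# Upload the training data, then kick off a fine-tune on a base model
upload = openai.File.create(file=open("pairs.jsonl", "rb"), purpose="fine-tune")
job = openai.FineTune.create(training_file=upload.id, model="davinci")
print(job.id)  # poll this job until the fine-tuned model is ready
```

The upside is you skip model dev and serving entirely; the downside is per-token inference cost, which is why the expected volume matters.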