
CowardlyVelociraptor t1_j7025ho wrote

You're paying a premium for the nice UI


2blazen t1_j70vh2g wrote

Might be just me, but I really hate how the reply is returned in the UI. Even if the subscription solves the random interruptions during generation, the word-by-word printing kills me; I'd rather wait a bit and receive my answer in one piece.


danielbln t1_j7c9mpc wrote

I much prefer to see the tokens as they are generated; it's much better UX because you can abort the generation if you feel it's not going in the right direction. All my GPT-3 integrations use stream:true and display every word as it comes in.
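For anyone curious, the streaming setup looks roughly like the sketch below. It uses the legacy openai Python client with a Completion call; the model name, prompt, and environment-variable lookup are illustrative placeholders, not details from this thread.

```python
import os
import openai

# Assumes the legacy (pre-1.0) openai Python client and an API key in the
# environment. Model and prompt are placeholders for illustration.
openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Explain streaming responses in one paragraph.",
    max_tokens=200,
    stream=True,  # yield chunks as they are generated instead of one final blob
)

# Each streamed event carries the next chunk of text; printing it immediately
# gives the word-by-word display, and you can interrupt (Ctrl+C) mid-generation.
for event in response:
    print(event["choices"][0]["text"], end="", flush=True)
print()
```

The win is perceived latency: the first tokens show up almost immediately, and you can bail out early instead of paying for a full completion you don't want.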
