
juliensalinas OP t1_jckwtdj wrote

No, if you want such a model to "remember" previous prompts you will need to prepend them to each request you make.

The output can be up to 2048 tokens. But on a Tesla T4 you might not have enough VRAM, so you may be limited to 1024 tokens because the GPU will run out of memory above that.
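To illustrate, here is a minimal sketch of what "prepending previous prompts" looks like with a generic GPT-style causal LM via Hugging Face transformers. The model name, the `ask` helper, and the plain newline-joined history format are assumptions for illustration, not the exact API discussed here:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumption: any GPT-style causal LM works the same way; gpt-neo-1.3B is just an example.
model_name = "EleutherAI/gpt-neo-1.3B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

history = []  # previous prompts and responses, kept client-side

def ask(prompt: str, max_new_tokens: int = 256) -> str:
    # The model itself is stateless: the whole history is prepended to every request.
    full_prompt = "\n".join(history + [prompt])
    inputs = tokenizer(full_prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Keep only the newly generated tokens, not the echoed prompt.
    response = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    history.extend([prompt, response])
    return response
```

Note that the history itself consumes tokens, so long conversations eat into the 2048-token budget and eventually have to be truncated or summarized.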

9

Necessary_Ad_9800 t1_jckxh1h wrote

Thanks for the answer. Is 1 letter equal to 1 token?

1

juliensalinas OP t1_jcky8ok wrote

You're welcome.

A token is a unique entity that can be a small word, part of a word, or a punctuation mark.
On average, 1 token is made up of about 4 characters, and 100 tokens are roughly equivalent to 75 words.
Natural Language Processing models need to turn your text into tokens in order to process it.
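A quick sketch of what that looks like in practice, assuming a GPT-style byte-pair-encoding tokenizer (the gpt2 tokenizer is used here purely as an example):

```python
from transformers import AutoTokenizer

# Assumption: GPT-2's BPE tokenizer, similar to what most GPT-style models use.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

text = "Natural Language Processing models need to turn your text into tokens."
tokens = tokenizer.tokenize(text)

print(tokens)       # whole words and sub-word pieces, not individual letters
print(len(text))    # number of characters
print(len(tokens))  # number of tokens, roughly len(text) / 4
```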

9