
Old_Pea3923 t1_j0kple3 wrote

I thought the context window of GPT-3 was 2048 or 4000 tokens, so how does ChatGPT work?

1

Old_Pea3923 t1_j0kq93b wrote

"While ChatGPT is able to remember what the user has said earlier in the conversation, there is a limit to how much information it can retain. The model is able to reference up to approximately 3000 words (or 4000 tokens) from the current conversation - any information beyond that is not stored.
Please note that ChatGPT is not able to access past conversations to inform its responses." - https://help.openai.com/en/articles/6787051-does-chatgpt-remember-what-happened-earlier-in-the-conversation

My question, then, is: how does it do this?

1
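The usual explanation (the help article doesn't spell it out) is that the chat interface resends the visible conversation with every request and silently drops the oldest turns once the roughly 4000-token budget would be exceeded, so "memory" is really just the prompt. A minimal sketch in Python of what such a rolling window could look like; `count_tokens` and `complete` below are hypothetical stand-ins, not real OpenAI API calls:

```python
# Sketch of a rolling context window: the conversation history is resent
# with every request, and the oldest turns are dropped once the token
# budget would be exceeded. Assumptions: a ~4000-token budget, a crude
# character-based token estimate, and a stubbed-out model call.

MAX_CONTEXT_TOKENS = 4000


def count_tokens(text: str) -> int:
    # Placeholder: a real system would use the model's tokenizer;
    # ~4 characters per token is only a rough proxy.
    return max(1, len(text) // 4)


def complete(prompt: str) -> str:
    # Stand-in for the actual model call.
    return "(model reply)"


def build_prompt(history: list[str], new_message: str) -> str:
    # Walk backwards from the newest message, keeping as many earlier
    # turns as still fit inside the token budget.
    kept = [new_message]
    budget = MAX_CONTEXT_TOKENS - count_tokens(new_message)
    for turn in reversed(history):
        cost = count_tokens(turn)
        if cost > budget:
            break  # everything older than this is simply dropped
        kept.append(turn)
        budget -= cost
    return "\n".join(reversed(kept))  # restore chronological order


def chat_turn(history: list[str], user_message: str) -> str:
    prompt = build_prompt(history, f"User: {user_message}")
    reply = complete(prompt)
    history.append(f"User: {user_message}")
    history.append(f"Assistant: {reply}")
    return reply


history: list[str] = []
print(chat_turn(history, "Hello!"))
```

In a real deployment the counting would use the model's actual tokenizer, and older turns might be summarized instead of dropped outright, but the basic point is the same: anything that no longer fits in the window is simply gone.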