
maxToTheJ t1_izvotec wrote

> The 822 number is irrelevant (given that OpenAI itself tells us that the window is much longer)

OpenAI says the "cache" is "3000 words (or 4000 tokens)". I don't see anything about the input being that. The test case the poster in the Twitter thread ran with Spanish text indicates the input is at the lower bound, which also lines up with the base GPT-3.5 model in the paper. The other stress test was trivial.

https://help.openai.com/en/articles/6787051-does-chatgpt-remember-what-happened-earlier-in-the-conversation
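For what it's worth, here is the kind of check I mean, as a minimal sketch (my assumptions, not anything from the paper/blog): count the tokens in a transcript with tiktoken, using the GPT-2 encoding as a rough stand-in for ChatGPT's tokenizer, and compare against the ~4000-token window the help article cites.

```python
# Sketch only: compare a transcript's token count against the stated ~4000-token window.
# Assumes tiktoken's GPT-2 encoding as an approximation of ChatGPT's actual tokenizer.
import tiktoken

WINDOW_TOKENS = 4000  # "3000 words (or 4000 tokens)" per the OpenAI help article


def fits_in_window(transcript: str) -> bool:
    enc = tiktoken.get_encoding("gpt2")
    n_tokens = len(enc.encode(transcript))
    print(f"{n_tokens} tokens vs. a {WINDOW_TOKENS}-token window")
    return n_tokens <= WINDOW_TOKENS


# e.g. paste the ~822-word Spanish test prompt here to see how close it sits
# to the stated limit before the model should start forgetting the beginning.
```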

> ...the whole twitter thread, and my direct link to OpenAI, are about the upper bound.

Details. No hand-wavy shit; explain with examples why it's longer, especially since your position is that some magical shit not in the paper/blog is happening.

0

farmingvillein t1_izvq3i8 wrote

> I don't see anything about the input being that.

Again, this has absolutely nothing to do with the discussion here, which is about memory outside of the prompt.

Again, how could you possibly claim this is relevant to the discussion? Only an exceptionally deep lack of conceptual understanding could cause you to make that connection.

4

maxToTheJ t1_izvqh2f wrote

This is boring. I am still waiting on those details.

No hand-wavy shit; explain with examples showing it's impressively longer, especially since your position is that some magical shit not in the paper/blog is happening.

1