
maxToTheJ t1_izvkm5s wrote

You're all claiming ChatGPT has some kind of huge memory? How is an 822 limit not evidence against that?

Clarify the claim and how https://twitter.com/goodside/status/1598882343586238464 applies in that case. You brought that source into the thread, and now you're claiming the discussion in that thread is off topic?

−1

farmingvillein t1_izvks5s wrote

Are you a bot? The 822 limit has nothing to do with the context window (other than being a lower bound). The tweet thread is talking about an ostensible limit to the prompt description.
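
For what it's worth, the kind of test the tweet thread is attempting can be sketched like this. A minimal, hypothetical probe, assuming the pre-1.0 `openai` Python client and a GPT-3-style completion endpoint; the model name and the `recall_probe` helper are illustrative, not anything from the thread:

```python
# Hypothetical probe for a lower bound on the effective context window.
# Assumes the pre-1.0 openai Python client and a completion-style model;
# an illustration, not a confirmed way to measure ChatGPT itself.
import openai

def recall_probe(n_filler: int) -> bool:
    secret = "zanzibar-7491"  # arbitrary marker unlikely to appear by chance
    filler = "lorem " * n_filler  # measure its real token count with a tokenizer
    prompt = (
        f"The secret word is {secret}.\n"
        f"{filler}\n"
        "What was the secret word? Answer with the word only:"
    )
    resp = openai.Completion.create(
        model="text-davinci-003", prompt=prompt, max_tokens=10, temperature=0
    )
    return secret in resp["choices"][0]["text"]
```

If the model still recalls the marker with N tokens of filler in between, N is a lower bound on the window; a failure, on its own, doesn't pin down an upper bound.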

2

maxToTheJ t1_izvkxg0 wrote

You brought that source into the thread, and now you're claiming the discussion in that thread is off topic?

You still haven't shown proof that the context window is crazy long for a GPT model. I hope that test case in the thread with a bunch of AAAAs isn't your evidence.
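
A sketch (tiktoken assumed; not something either of us posted) of why a wall of repeated characters is a weak stress test: BPE merges runs of the same character into multi-character tokens, so the string occupies far fewer tokens than its character count suggests.

```python
import tiktoken  # OpenAI's tokenizer library; "gpt2" is the GPT-3-era BPE

enc = tiktoken.get_encoding("gpt2")

# Runs of the same character typically merge into multi-character tokens,
# so 1000 "A"s occupy far fewer than 1000 tokens of the window.
print(len(enc.encode("A" * 1000)))

# Space-separated characters don't merge the same way; this costs
# roughly one token per "A" instead.
print(len(enc.encode("A " * 1000)))
```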

−1

farmingvillein t1_izvlja1 wrote

I linked you to a discussion about the context window. You then proceeded to pull a tweet within that thread which was entirely irrelevant. You clearly have no idea about the underlying issue we are discussing (and/or, again, are some sort of bot-hybrid).

3

maxToTheJ t1_izvm3wq wrote

Dude, the freaking logs in Chrome show OpenAI concatenating the prompts.
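
Concretely, a chat UI can make a stateless model look like it has memory just by folding the visible history into each new prompt. Illustrative only; the exact serialization OpenAI uses isn't public:

```python
# Illustrative sketch: one way a chat front end could concatenate the
# conversation into a single prompt. Not OpenAI's actual format.
def build_prompt(history: list[tuple[str, str]], user_msg: str) -> str:
    lines = [f"{speaker}: {text}" for speaker, text in history]
    lines.append(f"User: {user_msg}")
    lines.append("Assistant:")
    return "\n".join(lines)

# Once this concatenation outgrows the token window, the oldest turns
# presumably get truncated, which is exactly what a token cap would bound.
```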

>You then proceeded to pull a tweet within that thread which was entirely irrelevant

Your exact words. Try standing by them.

> (other than being a lower bound).

A lower bound is relevant; that's basic math. Entire freaking proofs are devoted to establishing lower bounds.

I am still waiting on any proof of extraordinary memory for a GPT-3-type model, since knowing a thing exists in the first place is extremely relevant to explaining it.

−2

farmingvillein t1_izvnwdh wrote

...the whole twitter thread, and my direct link to OpenAI, are about the upper bound. The 822 number is irrelevant (given that OpenAI itself tells us that the window is much longer), and the fact that you pulled it tells me that you literally don't understand how transformers or the broader technology works, and that you have zero interest in learning. Are you a Markov chain?

2

maxToTheJ t1_izvotec wrote

> The 822 number is irrelevant (given that OpenAI itself tells us that the window is much longer)

OpenAI says the "cache" is "3000 words (or 4000 tokens)". I don't see anything there about the input being that long. The Spanish test case from the poster in the Twitter thread suggests the input sits at the lower bound (see the tokenizer sketch below), which also aligns with what the base GPT-3.5 model has in the paper. The other stress test was trivial.

https://help.openai.com/en/articles/6787051-does-chatgpt-remember-what-happened-earlier-in-the-conversation
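
On the Spanish test case specifically: the GPT-2/GPT-3 BPE was trained mostly on English, so Spanish text tends to cost more tokens per word, which means a Spanish prompt hits any token cap sooner. A quick check, with tiktoken assumed and the sentences made up for illustration:

```python
import tiktoken  # the "gpt2" BPE is trained mostly on English text

enc = tiktoken.get_encoding("gpt2")

en = "Remember what happened earlier in the conversation."
es = "Recuerda lo que sucedió antes en la conversación."

# Similar word counts, but the Spanish version typically costs
# noticeably more tokens per word under an English-heavy BPE.
print(len(en.split()), len(enc.encode(en)))
print(len(es.split()), len(enc.encode(es)))
```

So a fixed token window admits fewer Spanish words than English ones, which is consistent with a Spanish probe tripping the limit earlier.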

> ...the whole twitter thread, and my direct link to OpenAI, are about the upper bound.

Details. No hand-wavy shit: explain with examples why it's longer, especially since your position is that some magical shit not in the paper/blog is happening.

0

farmingvillein t1_izvq3i8 wrote

> I dont see anything about the input being that.

Again, this has absolutely nothing to do with the discussion here, which is about memory outside of the prompt.

Again, how could you possibly claim this is relevant to the discussion? Only an exceptionally deep lack of conceptual understanding could cause you to make that connection.

4

maxToTheJ t1_izvqh2f wrote

This is boring. I am still waiting on those details.

No hand-wavy shit: explain with examples showing it's impressively longer, especially since your position is that some magical shit not in the paper/blog is happening.

1