IntelArtiGen t1_j4zr3iq wrote

Yeah, that's also what I would say. I doubt it's anything revolutionary, since nothing revolutionary is likely necessary. It might be an innovative use of embeddings of a conversation, but I wouldn't call that "revolutionary".

They probably don't use just one embedding for the whole conversation; perhaps they use one embedding per prompt and/or keep some tokens in memory.
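A per-prompt embedding memory like the one speculated about above could look roughly like this. This is only a sketch of the general idea, not anything confirmed about the actual system: `embed` is a stand-in for a real sentence encoder, and the retrieval is plain cosine similarity over stored prompt embeddings.

```python
import math

def embed(text):
    # Placeholder embedding: a real system would call a learned
    # encoder. Here we use normalized letter counts purely so the
    # sketch runs end to end.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a, b):
    # Vectors are already unit-normalized, so the dot product
    # is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

class ConversationMemory:
    """Store one embedding per prompt; on each new prompt, retrieve
    the most similar past prompts instead of feeding the entire
    conversation back into the model."""

    def __init__(self):
        self.entries = []  # list of (embedding, prompt text)

    def add(self, prompt):
        self.entries.append((embed(prompt), prompt))

    def recall(self, query, k=2):
        q = embed(query)
        ranked = sorted(self.entries,
                        key=lambda e: cosine(q, e[0]),
                        reverse=True)
        return [text for _, text in ranked[:k]]
```

The point is that memory cost grows with the number of stored embeddings rather than with raw conversation length, which is why an embedding scheme wouldn't need a hard limit.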

MysteryInc152 t1_j50pw6e wrote

With embeddings, it theoretically shouldn't have a hard limit at all. But the experiments here suggest a sliding context window of 8096 tokens:

https://mobile.twitter.com/goodside/status/1598874674204618753?t=70_OKsoGYAx8MY38ydXMAA&s=19
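A sliding context window just means the oldest messages fall out once the recent ones fill the budget. A minimal sketch, assuming whitespace token counting (a real system would use the model's tokenizer) and the 8096 figure from the linked experiments, which is not an official number:

```python
def sliding_window(messages, max_tokens=8096):
    """Keep only the most recent messages whose combined token
    count fits in the window; everything older silently drops off,
    which is exactly the 'forgetting the start' behavior described
    in this thread."""
    kept = []
    total = 0
    # Walk backwards from the newest message, accumulating tokens
    # until the budget is exhausted.
    for msg in reversed(messages):
        n = len(msg.split())  # naive token count
        if total + n > max_tokens:
            break
        kept.append(msg)
        total += n
    return list(reversed(kept))  # restore chronological order
```

This also explains why a repeated topic can look like long-range memory: if the topic comes up again inside the window, the model "remembers" it without ever having seen the original mention.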

Daos-Lies t1_j50vdq9 wrote

That is indeed fair enough.

Big fan of the concept of screaming at it until it forgets ;)

And I suppose it's very possible that, over my very long conversations with it, topics repeated at some stage (which I'm sure they did at points), and that could have fooled me into thinking it was remembering things from right at the start.
