
GPT-5entient t1_j9l4ex1 wrote

32k tokens would mean approximately 150 kB of text (at the usual ~4-5 characters per token). That is a decent-sized code base! With this much context memory, the known context-saving tricks would also work much better, so this could in theory be used to create code bases of virtually unlimited size.
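For anyone checking the arithmetic, here is a minimal back-of-envelope sketch. It assumes the common rule of thumb of roughly 4-5 characters per token for GPT-style BPE tokenizers; the exact ratio varies by tokenizer, language, and content:

```python
# Rough estimate of how much raw text fits in a 32k-token context window.
# Assumption: ~4-5 characters per token, a common rule of thumb for
# GPT-style BPE tokenizers; actual ratios vary with language and content.

CONTEXT_TOKENS = 32_000
CHARS_PER_TOKEN = (4, 5)  # low and high estimates

low_kb = CONTEXT_TOKENS * CHARS_PER_TOKEN[0] / 1000   # ~128 kB
high_kb = CONTEXT_TOKENS * CHARS_PER_TOKEN[1] / 1000  # ~160 kB
print(f"32k tokens is roughly {low_kb:.0f}-{high_kb:.0f} kB of text")
```

Which lands right around the 150 kB figure above.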
This amazes me and, being a software dev, also scares me...
But, as they say, what a time to be alive!

5