
Dankbubbles123 t1_j9hbbp1 wrote

13

Buck-Nasty t1_j9hch2c wrote

The context window is apparently massive, though: more than 10 times the size of GPT-3's. It could potentially write whole novels at that scale.

https://mobile.twitter.com/transitive_bs/status/1628118163874516992?s=46&t=Biiqy66Cy9oPH8c1BL6_JQ
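Rough back-of-the-envelope numbers (in Python, using the common ~0.75-words-per-token heuristic; the real ratio varies by tokenizer and text, and the 32k figure is only the rumor from the tweet):

```python
# Back-of-the-envelope: how much text fits in the rumored window?
WORDS_PER_TOKEN = 0.75   # common heuristic; varies by tokenizer/content
GPT3_TOKENS = 2_048      # original GPT-3 (davinci) context length
GPT4_TOKENS = 32_768     # rumored 32k context from the tweet

print(f"GPT-3: ~{GPT3_TOKENS * WORDS_PER_TOKEN:,.0f} words")  # ~1,536
print(f"GPT-4: ~{GPT4_TOKENS * WORDS_PER_TOKEN:,.0f} words")  # ~24,576
# ~24k words is novella territory; a typical novel (70k-100k words)
# would still take several windows or a summarization scheme.
```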

17

hydraofwar t1_j9hgim5 wrote

A credible researcher commented that ChatGPT can write code, and that GPT-4 could write entire programs.

12

GPT-5entient t1_j9hk7td wrote

32k tokens would mean approximately 150 kB of text. That is a decent-sized code base! Also, with this much context memory, the known context-saving tricks would work much better, so this could theoretically be used to create code bases of virtually unlimited size.
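To make that concrete, here's a rough size check plus a sketch of what one context-saving trick could look like. Everything here is hypothetical: `generate` and `summarize` stand in for whatever model calls you'd actually use, and ~4 characters per token is just a heuristic (code often tokenizes less efficiently than prose):

```python
# Size check: 32k tokens as plain text, assuming ~4 chars per token.
CONTEXT_TOKENS = 32_768
CHARS_PER_TOKEN = 4  # heuristic, not tokenizer-exact

print(f"~{CONTEXT_TOKENS * CHARS_PER_TOKEN / 1024:.0f} kB")  # ~128 kB


def build_codebase(spec_chunks, generate, summarize):
    """Hypothetical context-saving loop: carry a running summary of
    everything generated so far instead of the full text, so the
    project size isn't bounded by a single context window."""
    summary, files = "", []
    for chunk in spec_chunks:
        code = generate(f"Summary so far:\n{summary}\n\nImplement:\n{chunk}")
        files.append(code)
        summary = summarize(summary + "\n" + code)  # compress history
    return files
```

The summarize step does the heavy lifting; with a 32k window you can afford much richer summaries per step, which is presumably why these tricks would work so much better.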

This amazes me and also (being a software dev) scares me...

But, as they say, what a time to be alive!

16

GPT-5entient t1_j9hji5i wrote

Wow, yeah, this looks amazing. My biggest issue with GPT-3 is the relatively small context window. This will open so many new possibilities.

7