
hydraofwar t1_j9hgim5 wrote

A credible researcher commented that ChatGPT can write code, while GPT-4 could write entire programs.

12

GPT-5entient t1_j9hk7td wrote

32k tokens works out to roughly 130 kB of text (at the usual rule of thumb of ~4 characters per token). That is a decent-sized code base! Also, with this much context memory the known context-saving tricks would work much better, so this could theoretically be used to create code bases of virtually unlimited size.
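The back-of-envelope conversion above can be sketched like this; the ~4 characters per token figure is an assumption (the actual ratio varies by tokenizer and content):

```python
def tokens_to_kb(tokens, chars_per_token=4):
    """Estimate text size in kB from a token count.

    chars_per_token=4 is a common rule of thumb for English text;
    real tokenizers (e.g. BPE) vary, especially on source code.
    """
    return tokens * chars_per_token / 1000  # 1 kB = 1000 bytes

print(tokens_to_kb(32_000))  # -> 128.0, i.e. ~130 kB of text
```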

This amazes me and (being a software dev) also scares me...

But, as they say, what a time to be alive!

16