
jagedlion t1_j8kbruo wrote

Part of model building is that it compresses well and doesn't need to store the original data. It consumed roughly 45 TB of internet text and stores what it learned in about 700 GB of weights, its "working memory" (the inference engine itself can be stored in less space, but I can't pin down a specific minimal number).
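Just to make the compression point concrete, here's the back-of-the-envelope ratio those two numbers imply (a rough sketch; the 45 TB and 700 GB figures are the ones quoted above, and I'm assuming decimal units):

```python
# Implied "compression ratio" of training corpus -> model weights.
# Assumes decimal units: 1 TB = 1e12 bytes, 1 GB = 1e9 bytes.
corpus_bytes = 45e12    # ~45 TB of internet text consumed in training
weights_bytes = 700e9   # ~700 GB of stored model parameters

ratio = corpus_bytes / weights_bytes
print(f"~{ratio:.0f}x smaller than the data it was trained on")  # ~64x
```

That ~64x figure is only suggestive, since the weights aren't a literal archive of the text, but it gives a sense of how lossy the "remembering" has to be.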

It has to figure out what's worth remembering (and how to remember it) without ever seeing the test. It studied the general knowledge, but it didn't study for this particular exam.
