semitope t1_j8k09qi wrote

Sounds like about the same thing: being given the data beforehand vs. looking it up now. Fact is, it can't produce useful responses about factual matters without exposure to the data. It would be like someone talking about something they know absolutely nothing about, which might be why it's sometimes accused of making things up confidently.

0

jagedlion t1_j8k0uqa wrote

I mean, humans can't give you information they haven't been exposed to either. We just acquire more data during our normal day-to-day lives. People also do their best to infer from what they know. They're more willing to encode their certainty in their language, sure, but humans can also only work from the knowledge they have and the connections they can find within it.

4

semitope t1_j8k5n5n wrote

Humans aside, saying it doesn't need to acquire additional information from the internet or elsewhere isn't saying much if it has already acquired that information from the internet and elsewhere. It already studied for the exam.

0

jagedlion t1_j8kbruo wrote

Part of model building is that it compresses well and doesn't need to store the original data. It consumed 45 TB of internet text and stores what it learned in roughly 700 GB of working memory (the inference engine can be stored in less space, but I can't pin down a specific minimal number).
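As a rough back-of-the-envelope sketch, just taking the 45 TB and 700 GB figures above at face value (the real numbers vary by source):

```python
# Back-of-the-envelope compression ratio from the figures quoted above.
# These numbers are illustrative, not authoritative.
training_data_tb = 45    # reported size of the training corpus, in TB
model_size_gb = 700      # approximate size of the stored model, in GB

ratio = (training_data_tb * 1000) / model_size_gb
print(f"~{ratio:.0f}x compression")  # prints "~64x compression"
```

So on the order of a sixty-fold reduction, which is only possible because it keeps patterns and associations rather than the raw text.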

It has to figure out what's worth remembering (and how to remember it) without access to the test. It studied the general knowledge, but it didn't study for this particular exam.
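If it helps, here's a minimal sketch of that idea using a toy scikit-learn classifier as a stand-in (nothing here is specific to the model being discussed): the model only ever fits on training data, then gets scored on a held-out set it never saw.

```python
# Toy illustration of "studied the general material, not this particular exam":
# fit on one split of the data, evaluate on a held-out split never seen in training.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)            # "studying" the general material
print(model.score(X_test, y_test))     # "graded" on questions it never saw
```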

2