
tell-me-the-truth- t1_j45e4gv wrote

yeah, I can see the point behind the ML definition.. I guess I was trying to say you don't always get better with more data. The performance might saturate at some point, or the new data you add could be garbage.. so I found it a bit odd to tie the definition of ML to the quantity of data. The definition you linked talks about experience.. I'm not sure how that's defined.


VirtualHat t1_j45em2b wrote

Yes, true! Most models will eventually saturate and perhaps even become worse. I guess it's our job then to just make the algorithms better :). A great example of this is the new Large Language Models (LLMs), which are trained on billions if not trillions of tokens and still keep getting better :)
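To make the saturation idea concrete: empirical scaling-law work often models validation loss as a power law in dataset size plus an irreducible floor. This is just an illustrative sketch with made-up constants (`a`, `alpha`, `floor` are not from any real model), showing how more data keeps helping but with shrinking returns:

```python
# Illustrative scaling-law-style curve: loss(n) = a * n**(-alpha) + floor.
# All constants here are invented for the sketch; real values depend on the
# model family and data quality.
def loss(n, a=10.0, alpha=0.5, floor=1.5):
    """Hypothetical validation loss as a function of dataset size n."""
    return a * n ** (-alpha) + floor

sizes = [10 ** k for k in range(1, 7)]        # 10 .. 1,000,000 examples
losses = [loss(n) for n in sizes]

# Each 10x more data still lowers the loss, but each step helps less,
# and loss can never drop below the floor (here 1.5).
gains = [losses[i] - losses[i + 1] for i in range(len(losses) - 1)]
```

So "saturation" doesn't have to mean progress stops dead; under this kind of curve it just slows down, which fits the observation that LLMs keep improving with more tokens, only at a decreasing rate.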