ewankenobi t1_j9jl91t wrote

I like your wording, did you come up with that definition yourself or is it from a paper?

1

yldedly t1_j9jorh1 wrote

It's not from a paper, but I think it's fairly uncontroversial, though people tend to forget about the "bounded interval" part, or at least what it implies about extrapolation.

9

[deleted] t1_j9jsgf6 wrote

What is "bounded interval" here?

1

yldedly t1_j9judc7 wrote

Any interval [a, b] where a and b are finite numbers. In practice, it means the approximation will only be good in the parts of the domain where there is training data. I have a concrete example in a blog post of mine: https://deoxyribose.github.io/No-Shortcuts-to-Knowledge/
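You can see the effect with a toy example (this is my own illustration, not one from the blog post): fit a generic smooth approximator to sin(x) using data only from [0, 2π], then evaluate it outside that interval. A polynomial stands in for any flexible function approximator here.

```python
import numpy as np

# Training data only on the bounded interval [0, 2*pi]
x_train = np.linspace(0, 2 * np.pi, 200)
y_train = np.sin(x_train)

# Fit a degree-9 polynomial (a stand-in for any flexible approximator)
coeffs = np.polyfit(x_train, y_train, deg=9)

# Inside the training interval the approximation is good...
x_in = np.linspace(0, 2 * np.pi, 50)
err_in = np.max(np.abs(np.polyval(coeffs, x_in) - np.sin(x_in)))

# ...but just outside it, the error blows up
x_out = np.linspace(2 * np.pi, 3 * np.pi, 50)
err_out = np.max(np.abs(np.polyval(coeffs, x_out) - np.sin(x_out)))

print(f"max error inside  [0, 2*pi]: {err_in:.2e}")
print(f"max error outside [0, 2*pi]: {err_out:.2e}")
```

The fit is near-perfect where there was training data and diverges immediately past the interval's edge, which is exactly the extrapolation caveat above.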

8

[deleted] t1_j9jukfu wrote

Interesting, but that applies to us as well. So I'm not sure this holds once they learn very general things, like learning itself.

0