
tsgiannis OP t1_j4wk889 wrote

>Less data means more underspecification and thus the model more readily gets stuck in local minima

This is probably the answer to my "why".


I_will_delete_myself t1_j4ylmkp wrote

He just said why. It's because you aren't training on a diverse and large amount of data. ImageNet was trained on many different kinds of objects (over a million images), while your toy dataset probably only has 50-100k.
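Not from the thread, but a quick sketch of the effect being described: training the same model on a tiny subset versus the full training set and comparing test accuracy. This uses scikit-learn's digits dataset purely as a stand-in for "toy dataset vs. large dataset"; the subset size of 50 is an arbitrary choice for illustration.

```python
# Hypothetical illustration: how training-set size affects generalization.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

scores = {}
for n in (50, len(X_train)):  # tiny subset vs. all available training data
    clf = LogisticRegression(max_iter=2000)
    clf.fit(X_train[:n], y_train[:n])
    scores[n] = clf.score(X_test, y_test)  # accuracy on held-out test set

# Typically the model fit on the full training set generalizes noticeably
# better than the one fit on only 50 examples.
```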
