Submitted by CrazyCrab t3_zgvohh in MachineLearning
CrazyCrab OP t1_izisk28 wrote
Reply to comment by Latter_Security9389 in [D] Did I overfit to val by choosing the best checkpoint? by CrazyCrab
I think with so few images I can't afford a test set. Also, I thought that since I have approximately 50 million pixels to classify in the validation dataset, and given that computer vision practitioners often don't have a test split, I didn't really need one. Now I'm not sure.
plocco-tocco t1_iziu4zy wrote
Do 5- or 10-fold cross-validation in this case. It's often used when there isn't a lot of data.
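For concreteness, here's a minimal sketch of what k-fold splitting looks like (names and the toy sample count are my own, not from the thread): shuffle the indices once, cut them into k folds, and use each fold in turn as validation while training on the rest.

```python
import numpy as np

def kfold_indices(n_samples, k=5, seed=0):
    """Yield (train_idx, val_idx) index pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)          # shuffle once, then slice
    folds = np.array_split(idx, k)            # k roughly equal folds
    for i in range(k):
        val_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, val_idx

# Example: 5-fold split of a hypothetical 23-sample dataset
splits = list(kfold_indices(23, k=5))
```

Every sample appears in exactly one validation fold, so each model is evaluated on data it never trained on, which is the point when data is scarce.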
CrazyCrab OP t1_iziuhq4 wrote
Do you suggest doing cross-validation with the training-stopping rule "train for exactly the same number of steps as in this run", or with "train with checkpointing and pick the best checkpoint, as I did in this run"?
plocco-tocco t1_izj4iy8 wrote
I would take the best checkpoints (i.e., the ones just before the validation loss starts diverging from the training loss), not the same number of steps: the networks may not converge to a minimum at the same time, and some may be stuck somewhere for longer.