currentscurrents t1_j8op44d wrote
Does it though? There was a reproducibility survey recently which found that many optimizers claiming better performance did not, in fact, perform better on anything other than the tasks tested in their own papers.
Essentially they were doing hyperparameter tuning - just the hyperparameter was the optimizer design itself.
Seankala t1_j8r2317 wrote
> ...just the hyperparameter was the optimizer design itself.
Probably one of the best things I've read today lol. Reminds me of when old colleagues of mine would have lists of different PyTorch optimizers and just loop through them.
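For the curious, that "loop through the optimizers" approach looks something like this - a toy sketch with made-up data and a linear model, just to show the pattern (the model, data, and learning rate are all placeholders):

```python
import torch
import torch.nn as nn

# Toy synthetic regression data, purely for illustration
torch.manual_seed(0)
X = torch.randn(256, 10)
y = X @ torch.randn(10, 1) + 0.1 * torch.randn(256, 1)

# The "hyperparameter" being tuned: the optimizer itself
optimizer_classes = [
    torch.optim.SGD,
    torch.optim.Adam,
    torch.optim.RMSprop,
    torch.optim.Adagrad,
]

results = {}
for opt_cls in optimizer_classes:
    model = nn.Linear(10, 1)  # fresh model per run so the comparison is fair
    optimizer = opt_cls(model.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()
    for _ in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()
    results[opt_cls.__name__] = loss.item()

# Pick whichever happened to win on this task
best = min(results, key=results.get)
print(results, "->", best)
```

Of course, whichever optimizer "wins" here may just be the one whose defaults happen to suit this particular task - which is exactly the point the comment above is making.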