#
**BrisklyBrusque**
t1_j43dsux wrote

You might enjoy “Well-Tuned Simple Nets Excel on Tabular Data”

https://arxiv.org/abs/2106.11189

The authors built a search routine that leverages BOHB (Bayesian Optimization and Hyperband) to explore an enormous space of possible neural network configurations. The routine was allowed to select among different regularization techniques, including dropout, snapshot ensembles, and others that make the choice of parameter initialization less critical. However, the same optimizer (AdamW) was used in all experiments.
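To give a feel for the Hyperband half of BOHB: it evaluates many configurations on a small training budget, repeatedly keeps the best fraction, and grows the budget for the survivors. Below is a minimal pure-Python sketch of that successive-halving loop with a hypothetical toy objective (`toy_loss` and its "lr" parameter are my own illustration, not the paper's pipeline, and full BOHB would also replace the random sampling with a model-based sampler):

```python
import random

def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Hyperband-style successive halving: score every config on a small
    budget, keep the best 1/eta, then repeat with eta-times the budget."""
    budget = min_budget
    while len(configs) > 1:
        # Lower score = better; sort ascending and keep the front slice.
        scored = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = scored[: max(1, len(configs) // eta)]
        budget *= eta
    return configs[0]

def toy_loss(config, budget):
    """Toy stand-in for validation loss: a quadratic in the learning
    rate, with noise that shrinks as the training budget grows."""
    noise = random.gauss(0, 1.0 / budget)
    return (config["lr"] - 0.1) ** 2 + noise

random.seed(0)
# Randomly sampled candidate configurations (BOHB proper would sample
# these from a kernel-density model fit to past evaluations).
candidates = [{"lr": random.uniform(0.001, 0.5)} for _ in range(27)]
best = successive_halving(candidates, toy_loss)
print(best)
```

With 27 candidates and `eta=3`, the loop runs 27 → 9 → 3 → 1, spending most of its total budget on the few configurations that survived the cheap early rounds.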

Not exactly what you are looking for but hopefully interesting.

#
**Decadz**
OP
t1_j45h9oj wrote

Thanks for the suggestion, I’ll take a read!
