Submitted by radi-cho t3_11izjc1 in MachineLearning
Toast119 t1_jb16zt9 wrote
Reply to comment by Chadssuck222 in [R] [N] Dropout Reduces Underfitting - Liu et al. by radi-cho
I think it's because dropout is usually seen as a method for reducing overfitting, and this paper claims (and supports with evidence) that it can also reduce underfitting.
farmingvillein t1_jb18evq wrote
Yes. In the first two lines of the abstract:
> Introduced by Hinton et al. in 2012, dropout has stood the test of time as a regularizer for preventing overfitting in neural networks. In this study, we demonstrate that dropout can also mitigate underfitting when used at the start of training.
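For intuition, here's a minimal PyTorch sketch of what "dropout at the start of training" could look like: dropout is active only for the first few iterations and then switched off for the rest of training. The layer sizes, drop rate, and cutoff are illustrative assumptions, not the paper's actual code or hyperparameters.

```python
import torch
import torch.nn as nn

# Toy model with a single dropout layer we will toggle on/off.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.1),
    nn.Linear(256, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

early_dropout_iters = 1000  # assumed cutoff; the paper tunes this per setup

def set_dropout(model, p):
    """Set the drop probability of every Dropout module in the model."""
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.p = p

# Toy batch as a stand-in for a real data loader.
x = torch.randn(64, 784)
y = torch.randint(0, 10, (64,))

for step in range(2000):
    # Dropout is only used during the early phase of training; after the
    # cutoff the rest of training behaves like a standard no-dropout run.
    set_dropout(model, 0.1 if step < early_dropout_iters else 0.0)

    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```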