Submitted by radi-cho t3_11izjc1 in MachineLearning
WikiSummarizerBot t1_jb1uuaw wrote
Reply to comment by JEFFREY_EPSTElN in [R] [N] Dropout Reduces Underfitting - Liu et al. by radi-cho
>Dilution and dropout (also called DropConnect) are regularization techniques for reducing overfitting in artificial neural networks by preventing complex co-adaptations on training data. They are an efficient way of performing model averaging with neural networks. Dilution refers to thinning weights, while dropout refers to randomly "dropping out", or omitting, units (both hidden and visible) during the training process of a neural network. Both trigger the same type of regularization.
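The quoted passage describes dropout as randomly omitting units during training. A minimal NumPy sketch of the standard "inverted dropout" formulation (an illustration, not the implementation from the linked paper) looks like this:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability p
    and rescale survivors by 1/(1-p) so the expected activation matches
    the no-dropout case. At evaluation time the layer is the identity."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

x = np.ones((4, 3))
out_train = dropout(x, p=0.5, training=True)   # entries are 0.0 or 2.0
out_eval = dropout(x, p=0.5, training=False)   # unchanged input
```

Dropping whole units (as above) is dropout proper; DropConnect-style dilution instead zeroes individual weights, but the rescaling idea is the same.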