Submitted by radi-cho t3_11izjc1 in MachineLearning
JEFFREY_EPSTElN t1_jb1usu4 wrote
Reply to comment by xXWarMachineRoXx in [R] [N] Dropout Reduces Underfitting - Liu et al. by radi-cho
- Research, News
- A regularization technique for training neural networks: https://en.wikipedia.org/wiki/Dilution_(neural_networks)
WikiSummarizerBot t1_jb1uuaw wrote
>Dilution and dropout (also called DropConnect) are regularization techniques for reducing overfitting in artificial neural networks by preventing complex co-adaptations on training data. They are an efficient way of performing model averaging with neural networks. Dilution refers to thinning weights, while dropout refers to randomly "dropping out", or omitting, units (both hidden and visible) during the training process of a neural network. Both trigger the same type of regularization.
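For anyone who wants to see the distinction concretely, here's a minimal NumPy sketch (the function names and the "inverted" scaling convention are my own choices, not from the paper): dropout masks *activations*, while dilution/DropConnect masks individual *weights*.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    """Unit dropout: zero each activation with probability p.

    Uses "inverted" scaling so no rescaling is needed at test time.
    """
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p      # keep with probability 1 - p
    return x * mask / (1.0 - p)          # scale survivors to preserve the expectation

def dropconnect(x, W, p=0.5, training=True):
    """Dilution / DropConnect: zero individual weights instead of whole units."""
    if not training or p == 0.0:
        return x @ W
    mask = rng.random(W.shape) >= p
    return x @ (W * mask / (1.0 - p))

# Tiny usage example on a single linear layer.
x = rng.standard_normal((4, 8))          # batch of 4, 8 features
W = rng.standard_normal((8, 3))          # 8 -> 3 linear map
h_dropout = dropout(x) @ W               # mask activations, then project
h_dropconnect = dropconnect(x, W)        # mask weights inside the projection
```

At test time both paths reduce to the plain `x @ W`, which is the model-averaging effect the summary above mentions.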