buildbot_win t1_iuqf15z wrote

Oh hey, this is a neat justification for this type of pruning, I did my master's on this topic! The technique we came up with was called Dropback: basically, instead of zeroing pruned weights you reset them to their initial values, combined with a pseudo-RNG so you can reinitialize weights on the fly deterministically. You only had to track a few million parameters out of tens of millions to achieve similar accuracy. https://mlsys.org/Conferences/2019/doc/2019/135.pdf
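
To make the idea concrete, here's a minimal sketch (not the paper's code) of that Dropback-style step: pruned weights go back to their initial values, and those initial values are regenerated from a seeded RNG instead of being stored. The function names, the init scheme, and the "keep the weights that drifted furthest from init" criterion are illustrative assumptions.

```python
import torch

def regenerate_init(shape, seed):
    """Deterministically recreate a layer's initial weights from a seed,
    so they never need to be stored (assumed init scheme)."""
    gen = torch.Generator().manual_seed(seed)
    return torch.randn(shape, generator=gen) * 0.01

def dropback_step(weights, seed, k):
    """Keep the k weights that moved furthest from init; reset the rest to init."""
    init = regenerate_init(weights.shape, seed)
    drift = (weights - init).abs().flatten()
    keep = torch.zeros_like(drift, dtype=torch.bool)
    keep[drift.topk(k).indices] = True
    return torch.where(keep.reshape(weights.shape), weights, init)

# Usage sketch: after each optimizer step, prune back to the tracked budget.
# w = dropback_step(w, seed=layer_seed, k=tracked_params_for_layer)
```

Only the k "kept" weights ever need to be tracked as trained parameters; everything else is recoverable from the seed, which is where the memory savings come from.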