buildbot_win t1_iuqf15z wrote
Oh hey, this is a neat justification for this type of pruning; I did my master's on this topic! The technique we came up with was called Dropback: instead of zeroing pruned weights, it resets them to their initial values, using a pseudo-RNG so you can reinitialize weights on the fly deterministically. You only had to track a few million parameters out of tens of millions to achieve similar accuracy. https://mlsys.org/Conferences/2019/doc/2019/135.pdf
polandtown t1_ius2vtx wrote
Very cool! What do you mean by pseudo-RNG? Like picking random weight values from a previously established pool/range?
LetterRip t1_iusm58q wrote
Pseudo-RNGs produce deterministic results from a given seed, so they aren't truly random, but their output has a statistical distribution matching true randomness.
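To illustrate the point: a minimal sketch (not the paper's actual implementation; `init_weights` and the shapes are hypothetical) of how a seeded pseudo-RNG lets you regenerate the exact same initial weights on demand, so pruned weights can be "dropped back" to their initial values without storing them:

```python
import numpy as np

def init_weights(seed, shape):
    # Deterministic: the same seed always yields the same weights,
    # so initial values can be regenerated instead of stored.
    rng = np.random.default_rng(seed)
    return rng.standard_normal(shape)

seed = 42
w0 = init_weights(seed, (4,))                # initial weights at training start
w = w0 + 0.1                                 # pretend training updated them
mask = np.array([True, False, True, False])  # weights selected for reset

# Reinitialize the selected weights on the fly, no stored copy of w0 needed.
w[mask] = init_weights(seed, (4,))[mask]

assert np.allclose(w[mask], w0[mask])
```

Only the seed (and which weights were reset) needs to be tracked, which is why the memory cost stays a small fraction of the full parameter count.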