[D] Have you ever used Knowledge Distillation in practice? Submitted by fredlafrite on January 8, 2023 at 4:43 PM in MachineLearning · 13 comments
fredlafrite OP wrote on January 9, 2023 at 9:01 AM, replying to madmax_br5: Nice! There are even 20 MB versions of BERT, super interesting, thank you!
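For readers unfamiliar with the technique being discussed: the small BERT variants mentioned here (e.g. DistilBERT, TinyBERT) are trained with knowledge distillation, where a small student model is trained to match the temperature-softened output distribution of a large teacher. Below is a minimal, dependency-free sketch of the soft-target distillation loss from Hinton et al.; the function names and example logits are illustrative, not from any particular library.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the temperature-softened teacher and student
    distributions (the soft-target term of the distillation objective).
    Scaled by T^2 so its gradient magnitude stays comparable to the
    ordinary hard-label cross-entropy term it is usually combined with."""
    p = softmax(teacher_logits, temperature)   # teacher soft targets
    q = softmax(student_logits, temperature)   # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# A student that exactly matches the teacher incurs (near-)zero loss;
# any mismatch yields a positive penalty.
teacher = [2.0, 1.0, 0.1]
print(distillation_loss(teacher, teacher))
print(distillation_loss(teacher, [0.5, 1.5, 1.0]))
```

In practice this soft-target term is mixed with the standard cross-entropy on ground-truth labels, and compression to sizes like the 20 MB figure above additionally relies on architectural shrinking (fewer layers, smaller hidden size) plus quantization.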