Submitted by Oceanboi t3_z30bf2 in MachineLearning
hadaev t1_ixlti0s wrote
> and you may be better off starting from scratch.
Basically you are comparing random weights against good trained weights. Why should the latter be worse?
Oceanboi OP t1_ixlzu9h wrote
Maybe not always, but couldn't you argue that weights trained well for one task may not carry over well to another?
asdfzzz2 t1_ixm6knh wrote
Is there any reason for them to be worse than random weights? Because if not, then you have no reason not to use pretrained models.
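The argument above can be illustrated with a toy experiment (an assumed minimal sketch, not from the thread): fit a linear model by gradient descent from a random start versus a "pretrained" start near the solution, and compare the loss after the same training budget. Here the near-solution initialization stands in for pretrained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression problem standing in for a downstream task.
n, d = 200, 10
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.1 * rng.normal(size=n)

def final_loss(w0, steps=20, lr=0.05):
    """Run gradient descent from w0 and return the final mean squared error."""
    w = w0.copy()
    for _ in range(steps):
        grad = (2 / n) * X.T @ (X @ w - y)  # gradient of the MSE
        w -= lr * grad
    return float(np.mean((X @ w - y) ** 2))

random_init = rng.normal(size=d)                      # "training from scratch"
pretrained_init = w_true + 0.1 * rng.normal(size=d)   # a nearby "good" start

loss_random = final_loss(random_init)
loss_pretrained = final_loss(pretrained_init)
print(loss_pretrained <= loss_random)
```

Under the same step budget, the good starting point ends at a loss no worse (here, clearly better) than the random one, which is the intuition behind "you have no reason not to use pretrained models." The analogy is loose: real transfer involves a different target task, not the same one.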
hadaev t1_ixmajnt wrote
To add to it, non-random weights might be worse for tiny/simple models.
But modern vision models should be fine with it.
For example, BERT's text weights are a good starting point for image classification.