Submitted by Oceanboi t3_z30bf2 in MachineLearning
Oceanboi OP t1_ixlzu9h wrote
Reply to comment by hadaev in [D] Transfer Learning of Image Trained Network in Audio Domain by Oceanboi
Maybe not always, but couldn't you argue that weights trained well for one task may not carry over well to another?
asdfzzz2 t1_ixm6knh wrote
Is there any reason for them to be worse than random weights? Because if not, then you have no reason not to use pretrained models.
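A minimal sketch of that point (assuming PyTorch/torchvision; the ResNet-18 backbone and head size are just placeholders): the pretrained start differs from random initialization by a single argument, so it is cheap to try and easy to ablate against random weights.

```python
import torch.nn as nn
from torchvision.models import resnet18, ResNet18_Weights

def build_model(num_classes: int, pretrained: bool = True) -> nn.Module:
    # weights=None gives random initialization; DEFAULT gives the ImageNet-pretrained start.
    weights = ResNet18_Weights.DEFAULT if pretrained else None
    model = resnet18(weights=weights)
    # Swap the ImageNet classification head for one sized to the target task.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model
```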
hadaev t1_ixmajnt wrote
To add to that, non-random weights might be worse for tiny/simple models.
But modern vision models should be fine with them.
For example, BERT's text-pretrained weights are a good starting point for image classification.
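A rough, self-contained sketch of the cross-domain idea for the audio case in the original question (assuming PyTorch, torchvision, and torchaudio; the sample rate, mel settings, and class count are illustrative, not tuned): log-mel spectrograms are fed to an ImageNet-pretrained CNN as if they were images.

```python
import torch
import torch.nn as nn
import torchaudio
from torchvision.models import resnet18, ResNet18_Weights

model = resnet18(weights=ResNet18_Weights.DEFAULT)   # ImageNet start, not random
model.fc = nn.Linear(model.fc.in_features, 10)       # e.g. 10 audio classes

mel = torchaudio.transforms.MelSpectrogram(sample_rate=16000, n_fft=1024,
                                           hop_length=512, n_mels=128)
to_db = torchaudio.transforms.AmplitudeToDB()

def waveform_to_input(waveform: torch.Tensor) -> torch.Tensor:
    # waveform: (1, num_samples) mono audio at 16 kHz
    spec = to_db(mel(waveform))    # (1, 128, time) log-mel "image"
    spec = spec.repeat(3, 1, 1)    # replicate to 3 channels for the RGB-trained CNN
    return spec.unsqueeze(0)       # add batch dimension -> (1, 3, 128, time)

logits = model(waveform_to_input(torch.randn(1, 16000)))  # dummy 1-second clip
```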