
I_will_delete_myself t1_itv85fs wrote

I know it's more "cool" to use PyTorch, but the two frameworks perform essentially the same math and should give very similar results, at least when you train something from scratch. What matters more is the person using the tool rather than the tool itself.


Edit: Also, their VGG16 weights are probably different from TensorFlow's, so the comparison isn't accurate. You should try a model trained from scratch.
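To illustrate the weights point: each framework ships ImageNet weights from its own training run, so even the starting points differ. A quick way to check (a minimal sketch, assuming torchvision >= 0.13 for the weights enum):

```python
import torchvision.models as tvm
from tensorflow.keras.applications import VGG16

torch_vgg = tvm.vgg16(weights=tvm.VGG16_Weights.IMAGENET1K_V1)
tf_vgg = VGG16(weights="imagenet")

# First conv layer's weights; note the layouts differ per framework.
w_torch = torch_vgg.features[0].weight.detach().numpy()   # (64, 3, 3, 3)
w_tf = tf_vgg.get_layer("block1_conv1").get_weights()[0]  # (3, 3, 3, 64)
print("torch conv1 mean/std:", w_torch.mean(), w_torch.std())
print("tf    conv1 mean/std:", w_tf.mean(), w_tf.std())
```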


aleguida OP t1_itvhaee wrote

Thanks for the feedback. Turning off the pretraining causes PyTorch to learn more slowly (as expected), but TF gets stuck and doesn't learn anything. See the Colab notebooks below.
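To be explicit about what "turning off the pretraining" means here (a minimal sketch; exact arguments depend on the torchvision/Keras versions):

```python
import torchvision.models as tvm
from tensorflow.keras.applications import VGG16

# Random init in both frameworks, no ImageNet weights.
torch_scratch = tvm.vgg16(weights=None)
tf_scratch = VGG16(weights=None)
```

One caveat I'm aware of: the default initializers differ (Keras Conv2D uses glorot_uniform, PyTorch's Conv2d a Kaiming-style uniform), which can matter a lot for a deep plain-conv net like VGG16 trained from scratch.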

I've seen many other TF implementations add a few more FC layers on top of the VGG16, but, as you stated, I would expect to see the same problem in PyTorch, whereas I'm getting different results with a similar network. Next I will try building a CNN from scratch using the very same layers in both frameworks (sketched below).
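Something along these lines is what I have in mind for the identical-layers test (a rough sketch; the two-conv architecture, 224x224 input, and 10-class head are placeholders, not the actual notebook code):

```python
import torch.nn as nn
import tensorflow as tf

# The same small CNN, layer for layer, in both frameworks.
torch_net = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 56 * 56, 10),  # assumes 224x224 inputs, 10 classes
)

tf_net = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu",
                           input_shape=(224, 224, 3)),
    tf.keras.layers.MaxPooling2D(2),
    tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])
```

Even then I'd pin the initializers and optimizer hyperparameters explicitly in both, since the framework defaults differ.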


TF: https://colab.research.google.com/drive/1O6qzopiFzK5tDmLQAzLKmNoNaEMDc4Ze?usp=sharing
PYTORCH: https://colab.research.google.com/drive/1g-1CEpzmWJi9xOiZHzvDSv_-eDlXZO9u?usp=sharing
