nibbajenkem t1_ja5568v wrote
Reply to How would you approach this task? by JJ_00ne
Doesn't seem like anything you need deep learning for.
nibbajenkem t1_j8ojasp wrote
Reply to comment by canbooo in Physics-Informed Neural Networks by vadhavaniyafaijan
Of course, more inductive biases trivially lead to better generalization. It's just not clear to me why you can't forego the neural network and all its weaknesses and instead simply optimize the coefficients of the physical model itself. I.e., in the example in OP, why have a physics-based loss with a prior that it's a damped oscillator, instead of just doing regular interpolation on whatever functional class(es) describe damped oscillators?
I don't have much physics expertise beyond the basics, so I might be misunderstanding the true depth of the problem, though.
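To sketch what I mean (a minimal illustration, assuming a standard damped-oscillator form and scipy's curve_fit; the data and parameter names here are made up):

```python
# Minimal sketch: fit the physical model's coefficients directly,
# instead of training a physics-informed network.
import numpy as np
from scipy.optimize import curve_fit

def damped_oscillator(t, amplitude, decay, omega, phase):
    # Assumed functional form of the damped oscillator.
    return amplitude * np.exp(-decay * t) * np.cos(omega * t + phase)

# Noisy synthetic observations standing in for the measured data.
t_obs = np.linspace(0, 10, 50)
y_obs = damped_oscillator(t_obs, 1.0, 0.3, 2.0, 0.5) + 0.05 * np.random.randn(t_obs.size)

# Least-squares fit of the four physical coefficients.
params, _ = curve_fit(damped_oscillator, t_obs, y_obs, p0=[1.0, 0.1, 1.0, 0.0])
print(dict(zip(["amplitude", "decay", "omega", "phase"], params)))
```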
nibbajenkem t1_j8mpv71 wrote
Reply to Physics-Informed Neural Networks by vadhavaniyafaijan
What is the use case if it is already appropriate to model the phenomenon using regular physics?
nibbajenkem t1_j5yuece wrote
Doesn't matter. The bias can be negative if that is what the model learns
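A toy sketch of the point (assuming a single linear unit trained by gradient descent on made-up data; nothing constrains the bias to stay positive):

```python
import numpy as np

# Toy sketch: a single linear unit y = w*x + b trained by gradient descent.
# Nothing constrains b; if the data call for it, it simply ends up negative.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 2.0 * x - 3.0 + 0.01 * rng.standard_normal(200)  # true bias is -3

w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    pred = w * x + b
    grad_w = np.mean(2 * (pred - y) * x)
    grad_b = np.mean(2 * (pred - y))
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # b converges to roughly -3
```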
nibbajenkem t1_j4wii8d wrote
Reply to Why a pretrained model returns better accuracy than the implementation from scratch by tsgiannis
It's pretty simple. Deep neural networks are extremely underspecified by the data they train on: https://arxiv.org/abs/2011.03395. Less data means more underspecification, so the model more readily gets stuck in poor local minima; more data makes those minima easier to avoid. The question then boils down to how well the learned features transfer to different datasets. ImageNet pretraining generally works well because it's a diverse and large-scale dataset, which means models trained on it will by default avoid learning a lot of "silly" features.
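A minimal sketch of the two setups being compared (using torchvision's ResNet-18 as a stand-in; the target dataset, its class count, and the training loop are assumed):

```python
import torch.nn as nn
from torchvision import models

num_classes = 10  # assumed size of the target dataset's label set

# Pretrained: start from ImageNet features, replace only the classifier head.
pretrained = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
pretrained.fc = nn.Linear(pretrained.fc.in_features, num_classes)

# From scratch: same architecture, randomly initialised weights.
scratch = models.resnet18(weights=None)
scratch.fc = nn.Linear(scratch.fc.in_features, num_classes)

# Both are then trained identically on the target data; with little data the
# scratch model is far more underspecified and tends to land in worse minima.
```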
nibbajenkem t1_ja7d93f wrote
Reply to comment by JJ_00ne in How would you approach this task? by JJ_00ne
What I mean is that it doesn't make sense to use deep learning here.