
Oceanboi t1_j8uygkt wrote

Why was the neural network stopped at like 1000 steps? Why are we comparing a physics-informed neural network to a plain neural network trained for a different number of steps lol

Also, correct me if I'm wrong, but don't we care about how the model generalizes? I think we can show that some NN will fit any training set perfectly given enough steps, but that's already common knowledge, no?

1

crimson1206 t1_j8w4qnt wrote

The steps really don’t matter. The normal NN will not learn to extrapolate better with more steps.

This post is precisely showing that the PINN generalizes better than the normal NN outside the training range.

2

Oceanboi t1_j8zdely wrote

Oh I see, I missed the major point: the training data doesn't cover enough of the domain to capture the entire relationship.

Why embed priors into neural networks? Doesn't Bayesian modeling using MCMC do pretty much what this is attempting to do? We did something similar in one of my courses, although we didn't get to spend enough time on it, so forgive me if my questions are stupid. I'd also need someone to walk me through a motivating example for a PINN, because I'd just get lost in generalities otherwise. I get this example, but I'm failing to see the larger use case.

1
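The idea the thread is circling (add the physics residual to the training loss, so the model is constrained even where there is no data) can be sketched without a neural network at all. Below is a minimal illustration, not anyone's actual code: a degree-4 polynomial stands in for the network so the fit has a closed form, and the ODE du/dt = -u, the sampling ranges, and the weight `lam` are all made-up choices for the sake of the example.

```python
import numpy as np

# Toy problem: du/dt = -u with u(0) = 1, true solution u(t) = exp(-t).
# Training data covers only t in [0, 0.5]; extrapolation is tested on [0.5, 2].
t_data = np.linspace(0.0, 0.5, 10)
u_data = np.exp(-t_data)
t_coll = np.linspace(0.0, 2.0, 50)  # collocation points for the ODE residual
deg = 4                             # illustrative model capacity
ks = np.arange(deg + 1)

# Design matrices: A_data @ theta = model at the data points,
# A_res @ theta = model' + model at the collocation points (the ODE residual).
A_data = t_data[:, None] ** ks
A_model = t_coll[:, None] ** ks
A_deriv = np.where(ks > 0, ks * t_coll[:, None] ** np.maximum(ks - 1, 0), 0.0)
A_res = A_deriv + A_model

def fit(lam):
    """Minimize mean data error^2 + lam * mean ODE residual^2 (linear LSQ)."""
    lhs = A_data.T @ A_data / len(t_data) + lam * A_res.T @ A_res / len(t_coll)
    rhs = A_data.T @ u_data / len(t_data)
    return np.linalg.solve(lhs, rhs)

t_test = np.linspace(0.5, 2.0, 50)

def rmse(theta):
    pred = (t_test[:, None] ** ks) @ theta
    return np.sqrt(np.mean((pred - np.exp(-t_test)) ** 2))

plain_rmse = rmse(fit(lam=0.0))  # data-only fit, then extrapolate
pinn_rmse = rmse(fit(lam=1.0))   # physics residual added to the loss
print(plain_rmse, pinn_rmse)
```

The data-only fit is essentially free outside [0, 0.5] and drifts away from the true solution, while the physics term pins the model down across the whole interval; more training steps on the data term alone would never supply that constraint, which is the point made above.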