Comments

betelgeuse3e08 t1_iv4xedo wrote

Recently there has been growing interest in developing better deep-neural-network-based dynamics models for physical systems through better inductive biases. Here are some papers that use the structure of Lagrangian / Hamiltonian mechanics to learn better dynamics models:

  • Deep Lagrangian Networks (DeLaN)
  • Hamiltonian neural networks
  • DeLaN for energy control
  • Symplectic ODE-Net (SymODEN)
  • Dissipative SymODEN
  • Lagrangian neural networks
  • Simplifying Hamiltonian and Lagrangian neural networks via explicit constraints
  • Extending Lagrangian and Hamiltonian neural networks with differentiable contact models

The following survey paper nicely summarizes some of the work in this area:

  • Benchmarking energy-conserving neural networks for learning dynamics from data.
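
The shared idea behind these papers is easy to sketch: instead of learning accelerations directly, learn a scalar energy H(q, p) and derive the dynamics from Hamilton's equations, dq/dt = ∂H/∂p, dp/dt = -∂H/∂q. A minimal pure-Python sketch (an analytic harmonic-oscillator H stands in for the learned network, and the finite-difference gradients stand in for autodiff; both are assumptions for illustration):

```python
def hamiltonian(q, p):
    # Stand-in for a learned scalar network H(q, p); here a unit-mass
    # harmonic oscillator, H = p**2 / 2 + q**2 / 2, so we can check energy.
    return 0.5 * p * p + 0.5 * q * q

def grads(q, p, eps=1e-6):
    # dH/dq and dH/dp by central differences (autodiff in a real model).
    dHdq = (hamiltonian(q + eps, p) - hamiltonian(q - eps, p)) / (2 * eps)
    dHdp = (hamiltonian(q, p + eps) - hamiltonian(q, p - eps)) / (2 * eps)
    return dHdq, dHdp

def leapfrog(q, p, dt=0.01, steps=1000):
    # Symplectic integration of Hamilton's equations
    # dq/dt = dH/dp, dp/dt = -dH/dq; approximately conserves H.
    for _ in range(steps):
        p -= 0.5 * dt * grads(q, p)[0]   # half kick
        q += dt * grads(q, p)[1]         # drift
        p -= 0.5 * dt * grads(q, p)[0]   # half kick
    return q, p

q0, p0 = 1.0, 0.0
q1, p1 = leapfrog(q0, p0)
drift = abs(hamiltonian(q1, p1) - hamiltonian(q0, p0))
```

A symplectic integrator like the leapfrog above keeps the learned energy nearly constant over long rollouts, which is exactly the inductive bias these papers exploit.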

ShadowKnightPro OP t1_iv5aqta wrote

Thank you for such an insightful comment!

However, I'm looking for research that borrows ideas from physics to solve AI tasks (CV, NLP, ...), kinda like the Poisson flow generative model above. Do you know of any papers?


betelgeuse3e08 t1_iv5g9h6 wrote

I have been using these physics-informed dynamics models in controls / RL.

From a CV / NLP perspective, I'm not particularly sure. There was some work from DeepMind on learning latent dynamics from images. Check out "Benchmarking models for learning latent dynamics". However, I'm not sure if this is something you'd be interested in.


canbooo t1_iv6j73z wrote

I think the comment above is gold, and you are approaching this kinda wrong if this is about research. The fact that these models are not (yet) solving CV/NLP tasks is an advantage rather than a disadvantage. Although I must admit I see a more direct relation to RL than anything else, that makes it even more interesting, since any idea you come up with will probably be novel.


ShadowKnightPro OP t1_iv94h1t wrote

I totally agree with you, but I'm still in undergrad (just submitted one paper about multi-modal) and I have to prepare for grad school, aiming for a top university. So I suppose publishing more in trendy subfields would be more beneficial (maybe I'm wrong on this). I'll probably work on these topics in my Ph.D. Thanks for your advice!


canbooo t1_iv9jjg9 wrote

Ok, you are right; I was assuming you were already doing your Ph.D. In that case, I would keep it simple and focus on methodology rather than novelty. Good luck with your search and thesis.


patrickkidger t1_iv5vb05 wrote

Neural differential equations! The continuous-time limit of a lot of deep learning models can be thought of as a differential equation with a neural network as its vector field.
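
The connection is easiest to see in one line: a residual update x ← x + h·f_θ(x) is exactly an explicit Euler step of dx/dt = f_θ(x). A toy pure-Python sketch (the fixed tanh vector field is a made-up stand-in for a trained network):

```python
import math

def vector_field(x):
    # Made-up stand-in for a trained network f_theta(x).
    return math.tanh(0.5 * x) - 0.1 * x

def resnet_like(x, h=0.01, layers=500):
    # "Discrete deep net" view: one residual block per layer.
    for _ in range(layers):
        x = x + h * vector_field(x)
    return x

def odeint_euler(x, t1=5.0, dt=0.01):
    # "Continuous-time" view: explicit Euler integration of
    # dx/dt = f(x) up to t1 = layers * h -- the same computation.
    for _ in range(round(t1 / dt)):
        x = x + dt * vector_field(x)
    return x

a = resnet_like(1.0)   # deep residual network
b = odeint_euler(1.0)  # neural ODE solved with Euler's method
```

With matching step size the two views coincide exactly; the neural-ODE literature then swaps Euler for better adaptive solvers.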

A survey is On Neural Differential Equations.

Also +1 for /u/betelgeuse3e08's recommendations, which are primarily neural ODEs encoding particular kinds of physical structure; cf. Section 2.2.2 of the above.

You can find a lot of code examples of neural ODEs/SDEs/etc. in JAX in the Diffrax documentation.

This topic is kind of my thing :) DM me if you end up going down this route, I can try to point you at the open problems.


Gaussianperson t1_iv9bt1z wrote

Hey Patrick! Huge fan of your research :). It’s a really cool topic imho. Could you share the open problems?


patrickkidger t1_ivb3t7e wrote

See the conclusion of my thesis (linked above ;) )

TL;DR: everything neural PDEs, stable training of neural SDEs, applications of neural ODEs to ~all of science~, adaptive/implicit/rough numerical SDE solvers (although that one's very specialised), current work connecting NDEs with state-space models (S4D, MEGA, etc.), ... and so on!


vwings t1_iv4z8sm wrote

Mass-conserving LSTM


metatron7471 t1_iv50rlx wrote

Geometric deep learning (see Michael Bronstein) and equivariant CNNs (see Max Welling).
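
The central property in that line of work is equivariance: transforming the input and then applying the layer gives the same result as applying the layer first. A toy pure-Python check for the translation group and a circular 1-D convolution (signal and kernel values are arbitrary, for illustration only):

```python
def conv1d_circular(x, w):
    # Circular 1-D convolution: the basic translation-equivariant layer.
    n, k = len(x), len(w)
    return [sum(w[j] * x[(i + j) % n] for j in range(k)) for i in range(n)]

def shift(x, s):
    # Cyclic shift by s positions (the group action on signals).
    return x[-s:] + x[:-s]

x = [0.0, 1.0, 2.0, 3.0, 4.0, 0.5]   # arbitrary toy signal
w = [0.25, 0.5, 0.25]                # arbitrary toy kernel

# Equivariance: transform-then-layer equals layer-then-transform.
lhs = conv1d_circular(shift(x, 2), w)
rhs = shift(conv1d_circular(x, w), 2)
```

Equivariant CNNs generalize this identity from translations to other groups (rotations, permutations, gauge transformations).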


jloverich t1_iv4rm8j wrote

Poisson flow generative models


Top-Avocado-2564 t1_iv9ahef wrote

Physics-informed neural networks are a huge area of research in applied engineering, quantum chemistry, and physics.

Three major schools of approach are:

  1. Function approximation, originating from Lagaris et al.
  2. Operator learning: Karniadakis (DeepONets) and Caltech (Fourier neural operator). FNO is getting more publicity due to Nvidia trying to make it 'the' model.
  3. Graph neural networks (Battaglia et al.), primarily used for studying problems framed as large-scale interacting systems of X, where X is particles, objects, etc.

We do active work in this space.
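
The first school (function approximation à la Lagaris et al.) fits in a short sketch: build a trial solution that satisfies the initial condition by construction, then minimize the squared ODE residual at collocation points. Below, for the toy problem u' = -u, u(0) = 1, a tiny cubic model stands in for the neural network and all gradients are taken by finite differences; both are assumptions for illustration:

```python
import math

def trial(t, theta):
    # Lagaris-style trial solution: u(t) = 1 + t * N(t) satisfies
    # u(0) = 1 by construction. N is a tiny cubic model standing in
    # for a neural network (assumption for illustration).
    a, b, c = theta
    return 1.0 + t * (a + b * t + c * t * t)

def residual_loss(theta, pts):
    # Mean squared ODE residual u'(t) + u(t) at the collocation points,
    # with u' taken by central finite differences.
    eps, total = 1e-5, 0.0
    for t in pts:
        du = (trial(t + eps, theta) - trial(t - eps, theta)) / (2 * eps)
        r = du + trial(t, theta)
        total += r * r
    return total / len(pts)

pts = [k / 10 for k in range(11)]   # collocation grid on [0, 1]
theta = [0.0, 0.0, 0.0]
for _ in range(5000):               # plain gradient descent;
    grad = []                       # parameter gradients by finite differences
    for i in range(3):
        up, dn = theta[:], theta[:]
        up[i] += 1e-6
        dn[i] -= 1e-6
        grad.append((residual_loss(up, pts) - residual_loss(dn, pts)) / 2e-6)
    theta = [th - 0.05 * g for th, g in zip(theta, grad)]

err = abs(trial(1.0, theta) - math.exp(-1.0))   # distance from exact e**-1
```

The trained trial solution should land close to the exact solution e^(-t) on [0, 1]; real PINNs replace the cubic with a network and the finite differences with autodiff.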


Extra_Intro_Version t1_iv5os5k wrote

This is interesting. My background is in Mechanical Engineering (weakly solid mechanics) and I’ve been semi on the lookout for a connection between mechanics and neural networks. This is kind of a missing link I’m going to investigate further.
