
lightofaman t1_j3dccb2 wrote

PhD candidate in AI here. Gradient boosting is the real deal where tabular data is concerned (for both regression and classification). However, thanks to the universal approximation theorem (UAT), neural nets are excellent approximators of really complex functions and are therefore the way to go for complex tasks, like the ones found in scientific machine learning. LeCun (not so) recently said that deep learning is dead and differentiable programming (another way to describe SciML) is the new kid on the block.
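To make the gradient boosting point concrete, here's a minimal sketch of the idea in pure Python: fit a sequence of weak learners (decision stumps here), each one trained on the residuals of the ensemble so far, which is the negative gradient of squared loss. Everything below (`fit_stump`, `fit_gbm`, the toy data) is illustrative, not taken from any real library; production implementations like XGBoost or LightGBM use full trees, regularization, and many optimizations.

```python
def fit_stump(X, residuals):
    """Find the single-feature threshold split minimizing squared error."""
    best = None
    for j in range(len(X[0])):
        for thr in sorted(set(row[j] for row in X)):
            left = [r for row, r in zip(X, residuals) if row[j] <= thr]
            right = [r for row, r in zip(X, residuals) if row[j] > thr]
            if not left or not right:
                continue
            lmean = sum(left) / len(left)
            rmean = sum(right) / len(right)
            err = (sum((r - lmean) ** 2 for r in left)
                   + sum((r - rmean) ** 2 for r in right))
            if best is None or err < best[0]:
                best = (err, j, thr, lmean, rmean)
    return best[1:]  # (feature index, threshold, left value, right value)

def predict_stump(stump, row):
    j, thr, lval, rval = stump
    return lval if row[j] <= thr else rval

def fit_gbm(X, y, n_rounds=50, lr=0.1):
    """Each round, fit a stump to the current residuals and add it (scaled by lr)."""
    base = sum(y) / len(y)           # start from the mean prediction
    preds = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, preds)]
        stump = fit_stump(X, residuals)
        stumps.append(stump)
        preds = [p + lr * predict_stump(stump, row) for p, row in zip(preds, X)]
    return base, lr, stumps

def predict_gbm(model, row):
    base, lr, stumps = model
    return base + lr * sum(predict_stump(s, row) for s in stumps)

# Toy tabular data: target depends on feature 0; feature 1 is noise.
X = [[0.1, 5], [0.2, 1], [0.4, 9], [0.6, 2], [0.8, 7], [0.9, 3]]
y = [0, 0, 0, 1, 1, 1]
model = fit_gbm(X, y)
print(round(predict_gbm(model, [0.3, 4]), 2),
      round(predict_gbm(model, [0.7, 4]), 2))  # → 0.0 1.0
```

The key design choice is that each weak learner corrects the errors of the ensemble so far, so the residuals shrink geometrically (by a factor of 1 - lr per round on this separable toy data); that sequential error-fitting is what distinguishes boosting from bagging-style ensembles like random forests.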
