They're the same. Calc 3 should be considered a prerequisite for anything involving neural nets. Trying to understand their behavior without it is like going into physics without learning calculus.
Could be similar. Multivariable calc, I think, is a precursor to vector calc.
For us, we learned vector calculus as needed, such as in electromagnetics.
Calc 1 was derivatives and integrals and related topics. Calc 2 was derivatives and integrals of transcendental functions (exponentials/logs, trig functions, etc.) and infinite series. Calc 3 was a little bit of all of those things but in multiple dimensions, so partial derivatives, double integrals, and probably some vector stuff.
What we did for the vector calc part of emag was whatever was required there, up through some of the easier boundary value problem methods. But calc 3 didn't include things like divergence and curl, surface integrals, or Green's and Stokes' theorems, which are needed for specific emag problems.
trendymoniker t1_ivf3lp5 wrote
You absolutely need calc 3 and linear algebra for AI. Backprop is nothing but partial derivatives + ordered bookkeeping. And matrix math is the computational heart of neural networks.
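To make the "partial derivatives + ordered bookkeeping" point concrete, here's a minimal sketch (not from the thread; the network, shapes, and names are all illustrative) of backprop on a tiny one-hidden-layer net in NumPy. Every backward line is one application of the multivariable chain rule, and every operation is a matrix product:

```python
import numpy as np

# Tiny network: loss = mean((W2 @ tanh(W1 @ x) - y)^2).
# Backprop is just the multivariable chain rule (calc 3) expressed
# with matrix products (linear algebra), applied in reverse order.

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))        # input
y = rng.normal(size=(2, 1))        # target
W1 = rng.normal(size=(4, 3))       # first-layer weights
W2 = rng.normal(size=(2, 4))       # second-layer weights

# Forward pass: the "ordered bookkeeping" is saving intermediates.
z = W1 @ x                         # pre-activation
h = np.tanh(z)                     # hidden activation
p = W2 @ h                         # prediction
loss = np.mean((p - y) ** 2)

# Backward pass: each line is one chain-rule step.
dp = 2 * (p - y) / p.size          # dL/dp
dW2 = dp @ h.T                     # dL/dW2
dh = W2.T @ dp                     # dL/dh
dz = dh * (1 - np.tanh(z) ** 2)    # dL/dz   (tanh' = 1 - tanh^2)
dW1 = dz @ x.T                     # dL/dW1

# Sanity check one entry against a numerical partial derivative.
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
loss_p = np.mean((W2 @ np.tanh(W1p @ x) - y) ** 2)
assert abs((loss_p - loss) / eps - dW1[0, 0]) < 1e-4
```

The finite-difference check at the end is exactly the "do I understand partial derivatives?" test: nudge one weight, watch the loss, and compare against the gradient the chain rule produced.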