bohreffect t1_iz3gn0s wrote

You're sleeping on differentiable programming then

2

IDe- t1_iz6z4y3 wrote

The issue is that requiring a model to be differentiable puts far too many limitations on the types of models you can formulate. Much of the research in the last few decades has focused on dealing with issues caused purely by the artificial constraint of differentiability. It's purely "local optimization" in the space of potential models, when what we really should be doing is "basin-hopping".
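To make "basin-hopping" concrete: scipy ships a literal basin-hopping optimizer, and a minimal sketch on a made-up non-differentiable objective looks like this (objective and settings are purely illustrative):

```python
import numpy as np
from scipy.optimize import basinhopping

# A non-differentiable, multi-modal toy objective: gradients are useless here,
# but basin-hopping (random perturbation + gradient-free local search) still works.
def objective(x):
    return np.abs(x[0] - 1.5) + np.floor(np.abs(x[1])) + 0.1 * np.sum(x**2)

result = basinhopping(
    objective,
    x0=np.array([5.0, -5.0]),
    niter=200,
    minimizer_kwargs={"method": "Nelder-Mead"},  # derivative-free local step
)
print(result.x, result.fun)
```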

2

bohreffect t1_iz74sa2 wrote

But implying backprop is getting old neglects all of the real-world applications that haven't been pushed yet.

I understand there are problems where differentiability is an intractable assumption, but saying "oh old thing how gauche" isn't particularly constructive.

1

IDe- t1_iz77rsw wrote

Ah, I didn't intend to say that it's old or useless, just that I think it receives disproportionate research focus/effort.

2

[deleted] t1_iz6e54k wrote

"differentiable"

1

bohreffect t1_iz6emfb wrote

I mean, can you not compute the Jacobian of a constrained optimization program and stack that into any differentiable composition of functions?

People snoozin'.
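A minimal sketch of that idea using cvxpylayers (my choice of library here, not something from the thread): a small QP is wrapped as a layer, and autograd backpropagates through its argmin. The specific QP and downstream loss are made up for illustration.

```python
import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

# A small constrained QP whose solution depends on a parameter q:
#   minimize 0.5*||x||^2 + q'x   subject to   x >= 0,  sum(x) == 1
n = 3
x = cp.Variable(n)
q = cp.Parameter(n)
problem = cp.Problem(
    cp.Minimize(0.5 * cp.sum_squares(x) + q @ x),
    [x >= 0, cp.sum(x) == 1],
)

# Wrap the argmin as a differentiable layer; gradients w.r.t. q come from
# implicitly differentiating the optimality (KKT) conditions.
layer = CvxpyLayer(problem, parameters=[q], variables=[x])

q_t = torch.randn(n, requires_grad=True)
x_star, = layer(q_t)           # solve the QP in the forward pass
loss = (x_star ** 2).sum()     # any downstream differentiable loss
loss.backward()                # backprop through the solver
print(q_t.grad)
```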

1

[deleted] t1_iz6hlao wrote

no you can't because it's not actually a Jacobian

1

bohreffect t1_iz6j2xr wrote

The Jacobian of the solution of a constrained optimization program with respect to its parameters, but I thought that was understood amongst the towering intellect of neural network aficionados, e.g. the original commenter finding backprop to be stale.

Here's the stochastic programming version (Section 3.3): https://proceedings.neurips.cc/paper/2017/file/3fc2c60b5782f641f76bcefc39fb2392-Paper.pdf
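For the simplest case, an equality-constrained QP, that Jacobian falls straight out of differentiating the KKT system. A numpy sketch with made-up numbers (the paper's Section 3.3 handles the general stochastic-programming case; this is just the toy version):

```python
import numpy as np

# min_x 0.5*x'Qx + p'x   s.t.   Ax = b   (numbers are illustrative)
Q = np.array([[3.0, 0.5],
              [0.5, 2.0]])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
p = np.array([0.2, -0.4])

# The solution satisfies the linear KKT system  K [x; lam] = [-p; b]
K = np.block([[Q, A.T],
              [A, np.zeros((1, 1))]])
x_star = np.linalg.solve(K, np.concatenate([-p, b]))[:2]

# Differentiating the KKT system w.r.t. p:  K [dx/dp; dlam/dp] = [-I; 0],
# so the Jacobian dx*/dp is the top block of K^{-1} [-I; 0].
rhs = np.vstack([-np.eye(2), np.zeros((1, 2))])
dx_dp = np.linalg.solve(K, rhs)[:2, :]

print(x_star)
print(dx_dp)
```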

1

Ulfgardleo t1_iz9fjio wrote

Funny how that stuff always comes back. We used to differentiate SVM solutions w.r.t. kernel parameters like that back in the day.

1