
jostmey t1_j4vmqcs wrote

A deep neural network trained by backpropagation with gradient descent will, under standard smoothness and step-size assumptions, converge to a stationary point, which in practice is usually a local minimum; stochastic gradient descent gives similar guarantees in expectation with a diminishing learning rate. However, many components commonly added to deep networks, such as dropout and batch normalization, do not, as far as I know, come with convergence guarantees.

There are no guarantees of finding a global minimum.
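A minimal sketch of the local-vs-global point, using plain gradient descent on a toy one-dimensional non-convex function (the function, step size, and starting points are my own illustrative choices, not anything specific to deep networks):

```python
def f(x):
    # Non-convex function with two basins; the minimum near x = -1 is global.
    return (x**2 - 1)**2 + 0.3 * x

def grad(x):
    # Analytic derivative of f.
    return 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x, lr=0.01, steps=5000):
    # Plain full-batch gradient descent from a given starting point.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Starting in the right basin, descent settles into the shallower local minimum;
# starting in the left basin, it finds the deeper (global) one.
x_right = gradient_descent(0.9)
x_left = gradient_descent(-0.9)
print(x_right, f(x_right))
print(x_left, f(x_left))
```

Which minimum you end up in depends entirely on the starting point: the same update rule converges in both runs, but only one run reaches the global minimum.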
