
happy_guy_2015 t1_iqwnq5x wrote

In deep learning, a neuron is not represented as a linear function. The output of a neuron is computed by taking a linear combination of the inputs (a weighted sum plus a bias) and feeding the result into a non-linear activation function, e.g. ReLU. The non-linearity is critical, because without it you can't approximate non-linear functions well, no matter how deep the network is.

64
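A minimal numpy sketch of what a single neuron computes, as described above (the weights, bias, and inputs here are just illustrative values):

```python
import numpy as np

def relu(z):
    # ReLU: the non-linear activation, elementwise max(0, z)
    return np.maximum(0.0, z)

def neuron(x, w, b):
    # Linear combination of the inputs, then the non-linearity
    return relu(np.dot(w, x) + b)

x = np.array([1.0, -2.0, 0.5])  # inputs
w = np.array([0.4, 0.3, -0.6])  # weights
b = 0.1                         # bias
print(neuron(x, w, b))          # relu(-0.5 + 0.1) = relu(-0.4) = 0.0
```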

nemoknows t1_iqwq1aw wrote

Also, a linear transform of a linear transform is just another linear transform. You need those activation functions in between your layers; otherwise stacking multiple layers is pointless.

42
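A quick numpy sketch of that collapse (shapes and values are arbitrary): two linear layers with no activation in between reduce to a single linear layer, since W2(W1 x + b1) + b2 = (W2 W1) x + (W2 b1 + b2).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)

# Two linear layers with no activation in between
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)
two_layers = W2 @ (W1 @ x + b1) + b2

# They collapse into one linear layer: W = W2 W1, b = W2 b1 + b2
W, b = W2 @ W1, W2 @ b1 + b2
one_layer = W @ x + b

print(np.allclose(two_layers, one_layer))  # True
```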