Submitted by Emotional-Fox-4285 t3_yoauod in deeplearning
Emotional-Fox-4285 OP t1_ive060v wrote
Reply to comment by elbiot in In my deep NN with 3 layers, in the second iteration of GD the activations of Layer 1 and Layer 2 are all 0 due to ReLU, as all their inputs are smaller than 0, and L3 outputs a value with a huge floating-point magnitude, which is the opposite of the first forward propagation. Is this how it should work? by Emotional-Fox-4285
If you don't mind... I could share my code with you so you could take a look.
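The behaviour described in the post title is the classic "dying ReLU" problem: one oversized gradient step can drive a layer's pre-activations negative for every input, so ReLU outputs 0 everywhere and, since the ReLU gradient is also 0 there, the layer stops learning. Below is a minimal NumPy sketch of the effect; the layer shapes, targets, learning rate, and MSE loss are assumptions for illustration, not the OP's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-layer net: Linear -> ReLU -> Linear -> ReLU -> Linear
X = rng.normal(size=(16, 4))              # batch of 16 inputs (assumed shape)
y = rng.normal(size=(16, 1)) * 100        # large targets -> large gradients
W1, b1 = rng.normal(size=(4, 8)) * 0.5, np.zeros(8)
W2, b2 = rng.normal(size=(8, 8)) * 0.5, np.zeros(8)
W3, b3 = rng.normal(size=(8, 1)) * 0.5, np.zeros(1)
lr = 1.0                                  # deliberately too large

for step in range(3):
    # Forward pass
    z1 = X @ W1 + b1; a1 = np.maximum(z1, 0)
    z2 = a1 @ W2 + b2; a2 = np.maximum(z2, 0)
    z3 = a2 @ W3 + b3                     # linear output head
    print(f"step {step}: dead fraction L1={np.mean(a1 == 0):.2f}, "
          f"L2={np.mean(a2 == 0):.2f}, max |L3|={np.abs(z3).max():.1f}")
    # Backward pass (hand-derived gradients for MSE loss)
    dz3 = 2 * (z3 - y) / y.size
    dW3, db3 = a2.T @ dz3, dz3.sum(0)
    dz2 = (dz3 @ W3.T) * (z2 > 0)         # ReLU gradient: 0 where z2 <= 0
    dW2, db2 = a1.T @ dz2, dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (z1 > 0)
    dW1, db1 = X.T @ dz1, dz1.sum(0)
    # One oversized step can push z1/z2 negative for every input; after
    # that, a1 and a2 stay all 0 and their gradients stay 0 (no recovery).
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2),
                 (W3, dW3), (b3, db3)):
        p -= lr * g                       # in-place SGD update
```

With a step size this large the printed dead fraction typically climbs toward 1.0 within a step or two while the surviving output layer's values blow up, matching the symptom in the post. The usual remedies are a smaller learning rate, He initialization, or a leaky ReLU so the gradient never vanishes entirely.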
elbiot t1_ivexlrx wrote
No, I don't have time for that. Good luck.