Submitted by shitboots t3_zdkpgb in MachineLearning
modeless t1_iz2bm8r wrote
Reply to comment by new_name_who_dis_ in [R] The Forward-Forward Algorithm: Some Preliminary Investigations [Geoffrey Hinton] by shitboots
Well, no one knows exactly what the brain is up to in there, but we don't see enough backwards connections or activation storage to make backprop plausible. This is a way of learning without backwards connections, and that alone makes it more biologically plausible.
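For anyone wondering what "learning without backwards connections" looks like concretely, here is a minimal sketch of the layer-local "goodness" objective the paper describes, written in a PyTorch style. The class name, threshold, and optimizer choice are illustrative assumptions, not taken from the paper's code.

```python
import torch
import torch.nn as nn

# Minimal sketch (not the paper's code) of one Forward-Forward layer:
# the layer is trained locally so that its "goodness" (sum of squared
# activations) is high for positive data and low for negative data.
class FFLayer(nn.Module):
    def __init__(self, d_in, d_out, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.act = nn.ReLU()
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Length-normalize the input so the next layer cannot simply
        # read off the previous layer's goodness.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-6)
        return self.act(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Two forward passes (positive and negative data), no backward
        # pass through other layers.
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        # Push positive goodness above the threshold, negative goodness below it.
        loss = torch.log1p(torch.exp(torch.cat(
            [self.threshold - g_pos, g_neg - self.threshold]
        ))).mean()
        self.opt.zero_grad()
        loss.backward()  # gradients stay within this layer's parameters
        self.opt.step()
        # Detach outputs so no gradient ever flows between layers.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()
```

Stacking several such layers and calling train_step on each in turn trains the whole stack with purely local updates, which is the sense in which there are "no backwards connections" between layers.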
new_name_who_dis_ t1_iz2c6t0 wrote
I've heard that Hebbian learning is how brains learn, and this doesn't seem like Hebbian learning.
However, I don't know whether Hebbian learning is even how neuroscientists think we learn in contemporary research.
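For contrast, the textbook Hebbian rule updates a weight using only the local pre- and post-synaptic activity, with no error signal at all. A toy, purely illustrative version (not from any of the papers discussed here) might look like this:

```python
import numpy as np

# Toy illustration of the classic Hebbian rule: "cells that fire together
# wire together". The update uses only local activity; there is no error
# signal and no backward pass.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(10, 5))   # 5 inputs -> 10 outputs
eta = 0.01                                # learning rate (illustrative)

for _ in range(100):
    x = rng.normal(size=5)                         # pre-synaptic activity
    y = W @ x                                      # post-synaptic activity
    W += eta * np.outer(y, x)                      # Hebbian update: dW = eta * y x^T
    W /= np.linalg.norm(W, axis=1, keepdims=True)  # keep weights from blowing up
```

Forward-Forward updates are local in the same sense, but they are driven by a per-layer goodness objective rather than by raw activity correlations, so it is indeed not the plain Hebbian rule.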
whymauri t1_iz38qtl wrote
As of 2019, it is what I was taught in a graduate course on associative memory and emergent dynamics in the brain. We read Hertz's Introduction to the Theory of Neural Computation. This was right before people started working on the connection between Hopfield networks and self-attention.
fortunum t1_iz2v4li wrote
Check out e-prop for recurrent spiking neural networks.
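e-prop (eligibility propagation, Bellec et al. 2020) trains recurrent spiking networks with forward-computed eligibility traces multiplied by an online learning signal, avoiding backpropagation through time. A heavily simplified, non-spiking caricature of that idea, with all names and constants illustrative:

```python
import numpy as np

# Non-spiking caricature of the e-prop idea: each synapse keeps a
# forward-computed eligibility trace, and the update multiplies that trace
# by an online learning signal, so nothing is backpropagated through time.
rng = np.random.default_rng(0)
n_in, n_rec = 4, 8
W_in = rng.normal(scale=0.1, size=(n_rec, n_in))
alpha, eta = 0.9, 1e-3           # leak factor and learning rate (illustrative)

h = np.zeros(n_rec)              # leaky recurrent state
elig = np.zeros_like(W_in)       # one eligibility trace per synapse

for t in range(200):
    x = rng.normal(size=n_in)
    h = alpha * h + W_in @ x            # leaky integration
    elig = alpha * elig + x             # trace of each synapse's input (broadcast over rows)
    target = np.sin(0.1 * t)            # toy target signal
    err = h.mean() - target             # toy scalar readout error
    L = err * np.ones(n_rec)            # broadcast learning signal, one per neuron
    W_in -= eta * L[:, None] * elig     # local update: learning signal x eligibility trace
```

The real method uses spiking neuron models and a principled derivation of the traces; the point of the sketch is only that the updates are local in time and per synapse.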