Viewing a single comment thread. View all comments

kourouklides t1_j05bmni wrote

In my view, this sounds very boring. It would've been revolutionary if he had come up with a new Gradient-Free Deep Learning method in order to get rid of gradients entirely. With very few exceptions, over the last 10 years or so we keep seeing small, incremental changes in ML, but no breakthroughs.
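(To make the "gradient-free" idea concrete: methods like evolution strategies update network weights using only loss evaluations, never backpropagation. Below is a minimal, hypothetical sketch of that idea on a toy linear model — an illustration of the general technique, not any specific proposed method.)

```python
# Gradient-free optimization via a simple evolution-strategies loop:
# perturb the parameters, score each perturbation by its loss, and move
# in the fitness-weighted direction. No derivatives of the model are used.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn y = 2x + 1
X = np.linspace(-1.0, 1.0, 32)
y = 2.0 * X + 1.0

def loss(params):
    w, b = params
    return float(np.mean((w * X + b - y) ** 2))

params = np.zeros(2)          # start from (w, b) = (0, 0)
sigma, lr, pop = 0.1, 0.05, 50  # noise scale, step size, population size

for step in range(300):
    noise = rng.standard_normal((pop, 2))
    # Score each perturbed candidate; higher reward = lower loss.
    rewards = np.array([-loss(params + sigma * n) for n in noise])
    rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
    # Fitness-weighted average of the noise approximates a descent direction.
    params = params + (lr / (pop * sigma)) * noise.T @ rewards
```

The same loop scales (expensively) to neural networks, which is why gradient-based training has remained dominant — a point in the commenter's favor.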

2

Abhijithvega t1_j0esd7x wrote

Transformers? PINNs? Skip connections, Adam, hell, even RNNs happened less than 10 years ago.

2

kourouklides t1_j0jyi5c wrote

  1. A simple Google search would've revealed the following to you: "The concept of RNN was brought up in 1986. And the famous LSTM architecture was invented in 1997." Hence, not even close.
  2. Didn't I specify "with very few exceptions"? You merely mentioned those exceptions.
  3. Do you realize that in order to challenge someone's argument you need to specify the two quantities being compared? Which specific decade are you comparing this one with?
1