Submitted by gbfar t3_106ixxx in MachineLearning

I'm looking for a comprehensive reference on the history of neural networks that covers all significant papers in the field, from the early days up to the current deep learning era, and provides information on their main contributions and inspirations. It would be helpful to have information on how the understanding and perspectives of the research community on neural networks have evolved over time as well.

Do you know of any good references like that?

41

Comments


kanripper t1_j3gptt8 wrote

ChatGPT will probably do a good summary

−18

redflexer t1_j3h0fkx wrote

While this is certainly a very good and comprehensive article, be aware that it is a piece in an ongoing conflict between several ML pioneers in their struggle of controlling their legacy in the narrative of the field.

34

josep-panadero t1_j3h4m2w wrote

At this point ChatGPT seems to be quite good at answering this kind of question. What it does not provide is a personal assessment like "this book is inspiring" or "it was fun to read".

ChatGPT:

>There isn't a single reference that covers the entire history of neural networks in great detail, but there are a number of good books that you might find helpful. Here are a few that you might want to check out:
>
>"Deep Learning" by Ian Goodfellow, Yoshua Bengio, and Aaron Courville: This book provides a comprehensive introduction to deep learning, including a historical overview of the field. It covers the key ideas that have driven the development of neural networks from the early days up to the current deep learning era, and provides information on their main contributions and inspirations.
>
>"Neural Networks: A Comprehensive Foundation" by Simon S. Haykin: This book is considered a classic reference in the field of neural networks. It provides a detailed historical overview of the field, starting with the early perceptron models and moving through the development of more sophisticated architectures such as convolutional neural networks and recurrent neural networks.
>
>"The Deep Learning Revolution" by Terrence J. Sejnowski: This book provides a broad overview of the history of neural networks and deep learning, and discusses how these technologies have been applied to a wide range of applications, including image and speech recognition, natural language processing, and more. It also provides a glimpse into the future of deep learning and how it is likely to continue to impact a variety of fields.
>
>I hope these references are helpful! Let me know if you have any other questions.

So, I asked: Is "The Deep Learning Revolution" fun to read?

ChatGPT:

>"The Deep Learning Revolution" by Terrence J. Sejnowski is an engaging and informative read for anyone interested in the history and future of deep learning. It provides a broad overview of the field and discusses how deep learning has been applied to a wide range of applications, including image and speech recognition, natural language processing, and more. Sejnowski writes in an accessible style and does a good job of explaining complex concepts in a way that is easy to understand. Overall, I think most people would find it an enjoyable and educational read.

I still think that ChatGPT does not have a proper opinion, as it could judge a book easy to read that is incomprehensible to a human. But the answer may be useful nonetheless; at the least, it can serve as a starting point for further investigation.

ChatGPT can be used to find answers, just like Google is used to find answers on Stack Overflow.

I foresee that Reddit may need new rules about what to do with ChatGPT copy/paste, like Stack Overflow already has.

−7

gbfar OP t1_j3hbe6a wrote

Thanks for the suggestion, this is a good reference. I've actually taken a look at it already and noticed that it doesn't delve much into the early developments (there's barely any mention of McCulloch & Pitts's model). Do you know of any other references that go into more detail on each paper, like in "Evolution of Artificial Neural Networks" by Averkin and Yarushev?

4

peter201943 t1_j3hfad0 wrote

I'll make a counter argument. If you are going to compare ChatGPT to Google, then why do you feel it is appropriate to directly paste the results of ChatGPT on Reddit?

I ask because Google can return factually incorrect search results that must then be evaluated by whoever is performing the search. By posting the raw output, without applying any of your own critical judgment to the suggestions, you are merely diluting the level of intelligence in the conversation.

So on Reddit, a forum for humans (I might remind you), why would we post the raw results of an automated query, of whatever kind, when it then takes effort to determine whether those results are actually usable?

I think it's OK for you or anyone else to use ChatGPT to inform your answer, the same way it's OK to use Google or Wikipedia. And yes, mentioning that your answer came from ChatGPT is nicer than just pasting its output directly.

Another suggestion: if you have read enough ChatGPT responses, you'll notice a pattern in them, namely that they are full of filler and have a hard time getting to the point. This filler is undesirable when a quick response is wanted (such as on a forum, or in this case, a list of references). Compare the length of your comment to the other comments here. Do you see how much longer yours is?

Lastly, I know that there are a lot of easily Google-able questions on Reddit, but for something subjective, such as this thread in particular, the kind of information being sought is not objective or measurable. The OP is asking for opinions. Let's assume they've already Googled "Books on History of Artificial Neural Networks". Since ChatGPT is trained on the same data that Google indexes (the Internet), there is no benefit in copy-pasting ChatGPT's output. It does not have novel opinions; it simply aggregates known existing ones.

I look forward to your introspection.

4

gbfar OP t1_j3hjdpa wrote

Complementing your answer...

I actually used ChatGPT before posting this thread, and the answers were all unsatisfying, just like in the comment above yours. Actually looking into the resources recommended by ChatGPT will quickly show that most of them simply do not meet the criteria I specified in my post. The only useful reference is the book by Haykin, but the historical references appear to be scattered throughout the book, which makes it not so easy to read.

Also, I've not come here after just any opinions. I've come here hoping for informed opinions from experienced researchers who may have already read and evaluated many of the references on NN history that I'm asking for. That is something ChatGPT is very far from being able to provide.

5

MrEloi t1_j3hku55 wrote

Going all the way back to the Perceptron... or maybe even earlier?

You'll be very, very busy!

12

aigyfkkq t1_j3hlxmt wrote

Talking Nets from MIT Press has some nice interviews with pre-90s neural net researchers. Also, IIRC there's some commentary on those early papers in the Neurocomputing 1 and 2 collections, also from MIT Press.

7

junetwentyfirst2020 t1_j3hmb38 wrote

I would find a master's thesis from a notable deep learning person, because they will have laid out the related work since the "beginning of time", and it will be organized, with each subgroup of work related to the others.

1

clayhead_ai t1_j3htcfd wrote

This is such a fascinating book! Especially the parts about Walter Pitts. He was a genius from a young age; IIRC, he was sending letters to Bertrand Russell correcting his proofs when he was just a teenager. Very tragic story, though. Severe mental illness kept him from having the career he deserved. Someone should make a movie about him.

3

ml-research t1_j3l1fg3 wrote

Probably look for something Jürgen Schmidhuber wrote or presented.

2

pm_me_your_pay_slips t1_j3loz71 wrote

In the beginning God created the heaven and the earth. And the earth was without form, and void; and darkness was upon the face of the deep. And the Spirit of God moved upon the face of the waters. And God said, Let there be light: and there was light....

And God said, Let us make man in our image, after our likeness: and let them have dominion over the fish of the sea, and over the fowl of the air, and over the cattle, and over all the earth, and over every creeping thing that creepeth upon the earth. So God created man in his own image, in the image of God created he him; male and female created he them.

And Jürgen Schmidhuber chastised God for failing to cite his papers, since his creation of man and woman are special cases of Artificial Curiosity and Predictability Minimization.

8

viv1a t1_j3yfwu4 wrote

You can find it cheap on Amazon: https://www.amazon.com/Neurocomputing-Foundations-Research-James-Anderson/dp/0262510480

I second that it's a great book! It covers material up to the late 80s and has very nice commentary on various foundational papers from that period (McCulloch and Pitts, Hebb, the Perceptron, Adaline, the Neocognitron, as well as Hopfield's works). The earliest paper it includes is actually from 1890 (!) and is by the psychologist William James, who framed the mind as a kind of input-output machine.

There is an edition out there with a cool cover depicting a neuron on a circuit board.

1

Mysterious_Tekro t1_j42u5vx wrote

There is a Wikipedia article about the timeline or chronology and the milestones; if you look up those words you'll find stuff... Wikipedia gives a very long list of all the major challenges and achievements since the 1990s.

1