Submitted by LahmacunBear t3_106y5sy in MachineLearning

I've written a small piece of research on an idea of mine: a new optimizer with an adaptive global learning rate. It's based on Adam, and it uses what I think is a neat trick to get the calculus to work.
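Since the PDF isn't reproduced here, this is only a sketch of what an "adaptive global learning rate on top of Adam" could look like, not the author's actual method. It uses a hypergradient-style rule (the family of prior work the EDIT below alludes to): the global learning rate is nudged up when consecutive update directions agree and down when they disagree. All names and hyperparameters are illustrative.

```python
import numpy as np

def adam_adaptive_lr_step(param, grad, state, beta1=0.9, beta2=0.999,
                          eps=1e-8, hyper_lr=1e-7):
    """One Adam step whose global learning rate is itself adapted by
    gradient descent on the loss w.r.t. the learning rate (a sketch;
    the author's actual scheme may differ)."""
    m, v, lr, prev_update, t = state
    t += 1
    # Standard Adam moment estimates with bias correction.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    update = m_hat / (np.sqrt(v_hat) + eps)
    # Hypergradient of the loss w.r.t. lr is approximately
    # grad . (-prev_update), so descending on it means:
    # increase lr when successive update directions agree.
    lr += hyper_lr * np.dot(grad.ravel(), prev_update.ravel())
    param = param - lr * update
    return param, (m, v, lr, update, t)

# Usage: minimize f(x) = (x - 3)^2 from x = 0.
x = np.array([0.0])
state = (np.zeros(1), np.zeros(1), 0.1, np.zeros(1), 0)
for _ in range(200):
    g = 2.0 * (x - 3.0)
    x, state = adam_adaptive_lr_step(x, g, state)
```

The appeal of this family of methods is that the learning-rate update costs only one extra dot product per step, and the initial learning rate matters less because it self-corrects.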

EDIT: Didn’t do my research, seems like someone came up with a similar idea before. I do use a different implementation though. Will change the document to read as such. Everything else on the post applies still!

My goal in posting it here is mainly to ask for opinions and direction. To clarify: I've had no professional or formal education in machine learning, my studies in it are purely my own, and I'm not connected to any circles that could help me. What I've done is take some simple concepts and mimic what I've seen in papers I've read.

I think (hope) that I'm solid on the math, code, and concepts of AI, but clueless about the real-world stuff around it. This is me asking what that other stuff, taking first steps into this field publicly, is like.

Any advice would be much appreciated.

Many thanks. A PDF is available here.

