Submitted by LahmacunBear t3_106y5sy in MachineLearning

I've written a small piece of research on an idea of mine: a new optimizer with an adaptive global learning rate. It's based on Adam and uses (what I think is) a neat trick to get the calculus to work.

EDIT: Didn’t do my research; it seems someone came up with a similar idea before. I do use a different implementation, though, and will change the document to say so. Everything else in the post still applies!

My main goal in posting it here is to ask for opinions and direction. To clarify: I've had no professional or formal education in Machine Learning, my studies are entirely my own, and I'm not connected to any circles that could help me. What I've done is take some simple concepts and mimic what I've seen in papers I've read.

I think (and hope) that I'm solid on the math, code, and concepts of AI, but clueless about the real-world stuff around it. This is me asking what that other stuff, taking first steps into this field publicly, is like.

Any advice would be much appreciated.

Many thanks. A PDF is available here.

15

Comments


resented_ape t1_j3kiocc wrote

Have you looked at other related papers? In particular, hypergradient descent and papers that cite that work (which you can find via google scholar)?
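For reference, here's my own rough summary of the trick (not those papers' exact notation): hypergradient descent treats the learning rate $\alpha$ as a parameter in its own right and updates it by gradient descent on the loss. For plain SGD the paired updates are

$$
\theta_t = \theta_{t-1} - \alpha_t \,\nabla f(\theta_{t-1}), \qquad
\alpha_t = \alpha_{t-1} + \beta \,\nabla f(\theta_{t-1})^{\top} \nabla f(\theta_{t-2}),
$$

so $\alpha$ grows while consecutive gradients agree and shrinks once they start to oppose each other.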

7

LahmacunBear OP t1_j3l5ub2 wrote

Oh damn, that paper almost does exactly what I do. Huh. Oh well. Slightly different implementation though: I, in contrast, use both grads from the same timestep and keep an accumulated C_t.

5

SatoshiNotMe t1_j3lhi10 wrote

Are either of these open source and easily usable as a PyTorch optimizer?

1

LahmacunBear OP t1_j3mrexi wrote

Mine’s in TensorFlow 2.11, and I’m sure writing a PyTorch version wouldn’t be hard: the extra part of the algorithm is only three lines in my paper. I can share my code, though? A sketch of the general idea in PyTorch is below.
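To give a flavour of the PyTorch side, here's a minimal sketch of the generic hypergradient idea (a single global learning rate adapted via the dot product of successive gradients). To be clear, this is the plain SGD variant of hypergradient descent, not my Adam-based algorithm, and the class and hyperparameter names are just placeholders:

```python
import torch
from torch.optim import Optimizer


class SGDHD(Optimizer):
    """Sketch of SGD with a hypergradient-adapted global learning rate,
    in the spirit of hypergradient descent. Illustrative only."""

    def __init__(self, params, lr=1e-3, hypergrad_lr=1e-4):
        super().__init__(params, dict(lr=lr, hypergrad_lr=hypergrad_lr))

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            # Hypergradient: dot product of the current gradient with the
            # previous step's gradient, summed over all parameters.
            h = 0.0
            for p in group["params"]:
                if p.grad is None:
                    continue
                prev = self.state[p].get("prev_grad")
                if prev is not None:
                    h += torch.sum(p.grad * prev).item()
                self.state[p]["prev_grad"] = p.grad.clone()
            # Ascend on the dot product: grow lr while successive gradients
            # agree, shrink it when they start to oppose each other.
            group["lr"] += group["hypergrad_lr"] * h
            # Standard SGD step with the freshly adapted learning rate.
            for p in group["params"]:
                if p.grad is not None:
                    p.add_(p.grad, alpha=-group["lr"])
```

You'd use it like any other optimizer, e.g. `opt = SGDHD(model.parameters(), lr=1e-3, hypergrad_lr=1e-4)`, with the usual `backward()`/`step()` loop.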

1

SatoshiNotMe t1_j3n5p3v wrote

Thanks! I was just curious for future reference. I’ll need to read the papers first to see if they can help with my projects.

2