Submitted by pranftw t3_y4giax in MachineLearning

Hey everyone!

I released v0.0.2 of neograd, a deep learning framework created from scratch using Python and NumPy, with automatic differentiation capabilities. I'd taken it for granted that I understood how convolutions work. Just implement a sliding window, perform an element-wise multiplication, take its sum. Sounds simple, right? But add to that accounting for the algorithm's running time, the backward pass to get its gradients, and convolutions over volumes, and it turned out to be an excruciating undertaking.
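
For anyone who hasn't written one before, the forward pass described above boils down to something like the following plain-NumPy sketch. This is a generic illustration rather than neograd's actual code, and the function name and shapes are made up for the example (strictly speaking it computes cross-correlation, which is what deep learning frameworks usually call convolution):

```python
import numpy as np

def conv2d_naive(x, kernel):
    """'Valid' (no padding, stride 1) 2D convolution via a sliding window."""
    H, W = x.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            window = x[i:i + kH, j:j + kW]          # slide the window over the input
            out[i, j] = np.sum(window * kernel)     # element-wise multiply, then sum
    return out
```

The two Python loops are exactly where the running-time concerns come from, and the backward pass has to undo this bookkeeping to route gradients back to the right input and kernel positions.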

This release includes:

- Gradient checking to check the correctness of the gradients calculated by autograd (a minimal sketch of the idea follows this list)
- Optimization algorithms like Momentum, RMSProp, and Adam
- 2D and 3D Convolution and 2D and 3D Pooling layers for Convolutional Neural Networks
- Save trained models and weights to disk and load them whenever required
- Add checkpoints while training the model
- Documentation hosted at https://neograd.readthedocs.io
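
The gradient-checking idea, in general terms, is to compare whatever the autograd engine produces against a finite-difference estimate of the same gradient. Here is a minimal, framework-agnostic sketch; the function names are illustrative and not neograd's API:

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-7):
    """Central-difference estimate of df/dx for a scalar-valued f; x is a float ndarray."""
    grad = np.zeros_like(x)
    for idx in np.ndindex(x.shape):
        orig = x[idx]
        x[idx] = orig + eps
        f_plus = f(x)
        x[idx] = orig - eps
        f_minus = f(x)
        x[idx] = orig                                  # restore the original value
        grad[idx] = (f_plus - f_minus) / (2 * eps)
    return grad

def relative_error(analytic, numeric):
    """Small values (e.g. < 1e-5) suggest the analytic gradient is correct."""
    denom = np.linalg.norm(analytic) + np.linalg.norm(numeric) + 1e-12
    return np.linalg.norm(analytic - numeric) / denom
```

The gradient from the framework's backward pass is then compared against `numerical_gradient` using `relative_error`; a large discrepancy points to a bug in the hand-written backward pass.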

Check out the GitHub repo - https://github.com/pranftw/neograd

Explore the new features on Google Colab - https://colab.research.google.com/drive/1D4JgBwKgnNQ8Q5DpninB6rdFUidRbjwM?usp=sharing

https://colab.research.google.com/drive/184916aB5alIyM_xCa0qWnZAL35fDa43L?usp=sharing

https://i.redd.it/w6qufo75ywt91.gif

https://preview.redd.it/nvecir75ywt91.png?width=502&format=png&auto=webp&s=2d0e6fafdc263be39702eba079f056cc18bef1f1

https://preview.redd.it/rmyskv55ywt91.png?width=543&format=png&auto=webp&s=ce955315b5a941da8b0a6967e5e494ee885f1193

#ai #deeplearning #framework #python #numpy #neuralnetworks

Comments

loukitkhemka t1_ise31zs wrote

That is awesome. It is always rewarding to implement something from scratch.

pranftw OP t1_isfahf5 wrote

Thanks a lot! Rewarding and extremely painful because even a tiny mishap would break it entirely!

shawarma_bees t1_isg1xpp wrote

Sounds like a great learning experience! Out of curiosity, why are you packaging it and making it publicly available for others to use when we already have PyTorch/TensorFlow/Caffe/etc.?

pranftw OP t1_isi8ers wrote

If people want to try it out and use it, they will. My duty is just to write the code and make it easily available!

Check out the comment by bjourne at this link - https://news.ycombinator.com/item?id=33215834

> bjourne: Well done! The code is very clean and easy to read. I'll definitely recommend your library to people who want to learn how backprop works.

ChebyshevsBeard t1_isg5eso wrote

Fwiw, convolution can be implemented as a matrix multiplication.
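
For context, the usual trick here is im2col: unroll every input window into a row of a matrix, so a single matrix multiply against the flattened kernel replaces the sliding-window loops. A rough NumPy sketch of the idea (not taken from any particular framework):

```python
import numpy as np

def conv2d_as_matmul(x, kernel):
    """Same 'valid' 2D convolution as the naive version, expressed as one matmul (im2col)."""
    H, W = x.shape
    kH, kW = kernel.shape
    oH, oW = H - kH + 1, W - kW + 1
    # Each row of `cols` is one flattened kH x kW window of the input.
    cols = np.array([x[i:i + kH, j:j + kW].ravel()
                     for i in range(oH) for j in range(oW)])
    # One matrix-vector product replaces the nested sliding-window loops.
    return (cols @ kernel.ravel()).reshape(oH, oW)
```

The payoff is that the heavy lifting moves into a single highly optimized BLAS call instead of Python-level loops, at the cost of the extra memory used by the unrolled windows.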

pranftw OP t1_isg84lk wrote

Oh, I wasn't aware of that. I just implemented the straightforward approach.
