Submitted by pranftw t3_y4giax in MachineLearning
Hey everyone!
I released v0.0.2 of neograd, a deep learning framework created from scratch using Python and NumPy, with automatic differentiation capabilities. I'd taken for granted that I understood how convolutions work. Just implement a sliding window, perform element-wise multiplication, and take the sum. Sounds simple, right? But once you account for the running time of the algorithm, the backward pass to get the gradients, and convolutions over volumes, it turned out to be an excruciating undertaking.
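For anyone curious, the sliding-window idea boils down to something like this in plain NumPy. This is just a naive sketch to illustrate the concept, not neograd's actual implementation, which also has to handle padding, strides, volumes, and the backward pass:

```python
import numpy as np

def conv2d(x, kernel):
    # Naive 2D convolution (technically cross-correlation, as in most
    # DL frameworks): slide the kernel over the input, multiply
    # element-wise, and sum the result.
    kh, kw = kernel.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i+kh, j:j+kw] * kernel)
    return out

x = np.arange(16.0).reshape(4, 4)
k = np.array([[1.0, 0.0], [0.0, -1.0]])
print(conv2d(x, k))  # (3, 3) output
```

The two Python loops are exactly what kills the running time, which is why a real implementation has to vectorize them away.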
This release includes:

- Gradient checking to verify the correctness of gradients calculated by autograd (see the sketch below)
- Optimization algorithms like Momentum, RMSProp, and Adam
- 2D, 3D Convolution and 2D, 3D Pooling layers for Convolutional Neural Networks
- Saving trained models and weights to disk, and loading them whenever required
- Adding checkpoints while training the model
- Documentation hosted at https://neograd.readthedocs.io
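Gradient checking is the feature that saved me the most debugging time. The idea is to compare the analytic gradient from autograd against a centered finite-difference estimate. Here's a rough standalone sketch of the technique (the `grad_check` name and signature here are illustrative, not neograd's API):

```python
import numpy as np

def grad_check(f, x, analytic_grad, eps=1e-7):
    # Estimate df/dx_i with centered finite differences:
    # df/dx_i ~ (f(x + eps*e_i) - f(x - eps*e_i)) / (2*eps)
    numeric = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'])
    for _ in it:
        idx = it.multi_index
        orig = x[idx]
        x[idx] = orig + eps; f_plus = f(x)
        x[idx] = orig - eps; f_minus = f(x)
        x[idx] = orig  # restore
        numeric[idx] = (f_plus - f_minus) / (2 * eps)
    # Relative error; tiny values (~1e-7 or less) suggest the
    # analytic gradients are correct.
    denom = np.linalg.norm(analytic_grad) + np.linalg.norm(numeric) + 1e-12
    return np.linalg.norm(analytic_grad - numeric) / denom

x = np.random.randn(3, 3)
f = lambda x: np.sum(x ** 2)
print(grad_check(f, x, 2 * x))  # ~1e-9 or smaller
```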
Check out the GitHub repo: https://github.com/pranftw/neograd
Explore the new features on Google Colab:

- https://colab.research.google.com/drive/1D4JgBwKgnNQ8Q5DpninB6rdFUidRbjwM?usp=sharing
- https://colab.research.google.com/drive/184916aB5alIyM_xCa0qWnZAL35fDa43L?usp=sharing
loukitkhemka t1_ise31zs wrote
That is awesome. It is always rewarding to implement something from scratch.