This project implements a simple reverse-mode automatic differentiation (autograd) engine from scratch in Python. It’s inspired by Andrej Karpathy’s micrograd, and is designed to help understand the inner workings of neural networks, backpropagation, and gradient-based optimization — all without using external machine learning libraries.
- Scalar `Value` class with automatic gradient tracking
- Operator overloading for math operations (`+`, `-`, `*`, `**`, etc.)
- Backpropagation via the `.backward()` method
- Intuitive and readable implementation using object-oriented Python
- (Optional) Neural network example using the custom autograd engine
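The features above follow the micrograd pattern: each `Value` records its inputs and a local backward rule, and `.backward()` replays those rules in reverse topological order. A minimal sketch of that idea (illustrative only — the notebook's actual class supports more operators):

```python
class Value:
    """Scalar that tracks its own gradient through recorded operations."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._prev = set(_children)      # inputs that produced this Value
        self._backward = lambda: None    # local chain-rule step

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # product rule: each input's gradient is the other's value
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Usage: for z = x*y + x at (x=2, y=3), dz/dx = y + 1 = 4 and dz/dy = x = 2
x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

Accumulating with `+=` (rather than assigning) is what makes gradients correct when a value is reused in several places, as `x` is here.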
`Autograd.ipynb`: a step-by-step Jupyter Notebook explaining and implementing the autograd engine.
- Reverse-mode automatic differentiation
- Computational graphs
- Chain rule & gradient propagation
- Object-oriented programming
- Basics of neural networks
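One way to build intuition for chain-rule gradient propagation is to check a hand-derived gradient against a central finite difference. This standalone sketch (not part of the notebook, and using an illustrative function `f`) shows the idea:

```python
# f(x) = (x*w + b)**2; by the chain rule, df/dx = 2*(x*w + b) * w
def f(x, w, b):
    return (x * w + b) ** 2

def dfdx(x, w, b):
    # outer derivative (2 * inner) times inner derivative (w)
    return 2 * (x * w + b) * w

x, w, b = 1.5, -2.0, 0.5
h = 1e-6
# central difference approximates the derivative numerically
numeric = (f(x + h, w, b) - f(x - h, w, b)) / (2 * h)
print(dfdx(x, w, b), numeric)  # analytic and numeric values should agree closely
```

The same trick is a useful sanity check for any backward rule you add to the engine.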
- Clone the repo:

  ```bash
  git clone https://github.com/RayanBatada/autograd-from-scratch.git
  cd autograd-from-scratch
  ```

- Launch Jupyter:

  ```bash
  jupyter notebook Autograd.ipynb
  ```