PROJECT
autograd engine
Built an automatic differentiation engine in pure Rust, inspired by Karpathy's micrograd, supporting both forward- and reverse-mode automatic differentiation (the latter being backpropagation).
The goal was to understand the mathematics and computational graphs underlying modern deep learning frameworks.
Developed a dynamic computational graph from scratch in Rust, implementing core operations (add, mul, relu, tanh) with automatic gradient accumulation.
skills used
- Rust
results
Successfully demonstrated end-to-end learning on XOR and image classification tasks using a custom neural network architecture and SGD optimizer.