Micrograd.jl - a port of Andrej Karpathy's Micrograd to Julia

Andrej Karpathy has a great walkthrough of building a scalar reverse-mode autodiff library and a minimal neural network on top of it (GitHub - karpathy/micrograd: A tiny scalar-valued autograd engine and a neural net library on top of it with PyTorch-like API). This is a port to Julia with zero dependencies.
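
To give a flavor of the core idea: in scalar reverse-mode autodiff, each number carries its value, a gradient slot, and a closure that knows how to push gradients back to the nodes that produced it. Below is an independent, self-contained sketch of that pattern in Julia — the type and function names mirror Karpathy's Python original, not necessarily this package's actual API:

```julia
# Each node stores its value, its accumulated gradient, a closure that
# propagates gradients to its parents, and the parents themselves.
mutable struct Value
    data::Float64
    grad::Float64
    _backward::Function
    _parents::Vector{Value}
end

Value(x::Real) = Value(float(x), 0.0, () -> nothing, Value[])

function Base.:+(a::Value, b::Value)
    out = Value(a.data + b.data, 0.0, () -> nothing, [a, b])
    out._backward = () -> begin
        a.grad += out.grad          # d(a+b)/da = 1
        b.grad += out.grad          # d(a+b)/db = 1
    end
    return out
end

function Base.:*(a::Value, b::Value)
    out = Value(a.data * b.data, 0.0, () -> nothing, [a, b])
    out._backward = () -> begin
        a.grad += b.data * out.grad # d(a*b)/da = b
        b.grad += a.data * out.grad # d(a*b)/db = a
    end
    return out
end

# Topologically sort the graph, then apply each node's local rule in reverse.
function backward!(root::Value)
    topo = Value[]
    visited = Set{Value}()
    function visit(v)
        if !(v in visited)
            push!(visited, v)
            foreach(visit, v._parents)
            push!(topo, v)
        end
    end
    visit(root)
    root.grad = 1.0
    foreach(v -> v._backward(), reverse(topo))
end

a, b = Value(2.0), Value(-3.0)
c = a * b + a
backward!(c)
a.grad, b.grad   # (-2.0, 2.0): dc/da = b + 1, dc/db = a
```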

The meat of it is ~150 + ~170 lines of code for the autodiff and nn components, so it’s highly explorable. I don’t have an excellent video like Andrej’s to go with it, but there are two minimal Jupyter notebooks.
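
For a taste of what training with such an engine looks like, here is a hypothetical single gradient-descent step reusing the `Value` sketch above — the actual layer and MLP types in the package may be named differently, so treat the notebooks as the ground truth:

```julia
# One gradient-descent step on a single linear neuron, built from the
# Value type sketched earlier (not the package's real nn API).
w1, w2, bias = Value(0.5), Value(-0.3), Value(0.1)
x1, x2 = Value(1.0), Value(2.0)

pred = w1 * x1 + w2 * x2 + bias
err  = pred + Value(-1.0)   # target y = 1.0
loss = err * err            # squared error

backward!(loss)
for p in (w1, w2, bias)
    p.data -= 0.1 * p.grad  # learning rate 0.1
end
```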

I made this mostly for myself and for local teaching purposes, but thought I’d share in case anyone else finds it useful.

Things it’s not:

  • performant
  • the best Julian-style code