[ANN] Julia programming for Machine Learning

Hello everyone!

I’m happy to announce the Julia programming for Machine Learning course that I got to teach for the first time at TU Berlin this semester:


The course is taught in five weekly sessions of three hours, each covering two lectures:

| Week | Lecture | Content |
|------|---------|---------|
| 1 | 0 | General Information, Installation & Getting Help |
| 1 | 1 | Basics 1: Types, Control-flow & Multiple Dispatch |
| 2 | 2 | Basics 2: Arrays, Linear Algebra |
| 2 | 3 | Plotting & DataFrames |
| 3 | 4 | Basics 3: Data structures and custom types |
| 3 | 5 | Classical Machine Learning |
| 4 | 6 | Automatic Differentiation |
| 4 | 7 | Deep Learning |
| 5 | 8 | Workflows: Scripts, Experiments & Packages |
| 5 | 9 | Profiling & Debugging |

The first three weeks focus on the fundamentals of the Julia programming language.
These weeks consist of longer lectures, followed by shorter "guided tours" of the Julia ecosystem, covering plotting, DataFrames, and classical machine learning algorithms.
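To give a flavor of these guided tours, here is a minimal DataFrames example of the kind covered in week two (the column names and values are my own illustration, not taken from the course material):

```julia
using DataFrames

# Construct a small table and compute a summary statistic
df = DataFrame(name=["Alice", "Bob"], score=[1.5, 2.0])
mean_score = sum(df.score) / nrow(df)  # 1.75
```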

Week four is all about Deep Learning: A comprehensive lecture on automatic differentiation (AD) sheds light on differences between Julia’s various AD packages, before giving a brief overview of Flux’s Deep Learning ecosystem.
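As a small illustration of the AD lecture's topic, forward- and reverse-mode packages expose very similar entry points (the function `f` below is my own example, not from the course):

```julia
using ForwardDiff, Zygote

f(x) = x^2 + 3x  # f'(x) = 2x + 3

# Forward-mode AD:
ForwardDiff.derivative(f, 2.0)   # 7.0

# Reverse-mode AD (returns a tuple of gradients):
Zygote.gradient(f, 2.0)[1]       # 7.0
```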

Finally, week five covers starting your own Julia project, taking a look at the structure of Julia packages and different workflows for reproducible machine learning research.
This is followed up by a demonstration of Julia’s debugging and profiling utilities.

The lectures and the homework cover the following packages:

| Package | Lecture | Description |
|---------|---------|-------------|
| LinearAlgebra.jl | 2 | Linear algebra (standard library) |
| Plots.jl | 3 | Plotting & visualizations |
| DataFrames.jl | 3 | Working with and processing tabular data |
| MLJ.jl | 5 | Classical Machine Learning methods |
| ChainRules.jl | 6 | Forward- & reverse-mode rules for automatic differentiation |
| Zygote.jl | 6 | Reverse-mode automatic differentiation |
| Enzyme.jl | 6 | Forward- & reverse-mode automatic differentiation |
| ForwardDiff.jl | 6 | Forward-mode automatic differentiation |
| FiniteDiff.jl | 6 | Finite differences |
| FiniteDifferences.jl | 6 | Finite differences |
| Flux.jl | 7 | Deep Learning abstractions |
| MLDatasets.jl | 7 | Dataset loader |
| PkgTemplates.jl | 8 | Package templates |
| DrWatson.jl | 8 | Workflow for scientific projects |
| Debugger.jl | 9 | Debugger |
| Infiltrator.jl | 9 | Debugger |
| ProfileView.jl | 9 | Profiler |
| Cthulhu.jl | 9 | Type inference debugger |

Implementation details

Inspired by the Introduction to Computational Thinking course, every lesson and homework is a Pluto notebook. Besides reactivity and automatic homework feedback, a big advantage of this approach is that each Pluto notebook contains a Project.toml and Manifest.toml. This gives students a reproducible Julia environment and allows us to defer the topic of environments to lesson 8.
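For readers unfamiliar with this mechanism: since Pluto's built-in package management, a notebook file embeds its environment as string constants near the end of the `.jl` file. Roughly, the tail of a notebook looks like the sketch below (an illustrative fragment, not copied from the course notebooks; the UUID and package entries are placeholders):

```julia
# ╔═╡ 00000000-0000-0000-0000-000000000001
PLUTO_PROJECT_TOML_CONTENTS = """
[deps]
DataFrames = "..."
"""

# ╔═╡ 00000000-0000-0000-0000-000000000002
PLUTO_MANIFEST_TOML_CONTENTS = """
...
"""
```

Opening the notebook therefore recreates the exact package versions the instructor used, with no manual environment setup by students.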

Using PlutoSliderServer.jl, notebooks are automatically exported to HTML using GitHub Actions. The website is built using Franklin.jl and currently uses iFrames to include the Pluto notebooks. To make this look more seamless, the website re-uses a lot of Pluto’s CSS. Most of the magic then happens in this GitHub Action for deployment.
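For reference, the export step the GitHub Action performs can be sketched as follows (assuming the notebooks live in a `notebooks/` folder; `export_directory` is from PlutoSliderServer's documented API):

```julia
using PlutoSliderServer

# Render every Pluto notebook in the folder to a static HTML file,
# which the Franklin site then embeds via iFrames:
PlutoSliderServer.export_directory("notebooks")
```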

All of the code is MIT licensed.

If you are an author / contributor to one of the packages covered in this course, let me know if you would like me to make some changes. Contributions are most welcome!



That’s awesome! Combining Pluto with Franklin looks very cool and I want to do it myself. Maybe @fonsp can help figure out a way to make it easier for new users (besides copying the CI YAML)?


There already is static-export-template, which will create a minimalistic index page that links to all notebooks. This is probably the easiest approach.

Using iFrames gives you the advantage of running the embedded Pluto notebooks on Binder. The downside is that clicking on a link inside a notebook will also open it in the iFrame.

An alternative approach is to use PlutoStaticHTML.jl to convert notebooks to static HTML content.
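A minimal conversion with that package might look like this (assuming a `notebooks/` directory; `BuildOptions` and `build_notebooks` are the names from PlutoStaticHTML's documentation):

```julia
using PlutoStaticHTML

# Evaluate each notebook and write plain HTML next to it,
# which can be included in a page directly instead of via an iFrame:
bopts = BuildOptions("notebooks")
build_notebooks(bopts)
```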


Have you tried using the slider server to make any of the notebooks runnable on the website, for the interactive parts only? I was wondering which service would be the cheapest or easiest way to do that.