Similar to this thread, I took the presentation by Yakir and made some modifications to talk more about things that are relevant for economics PhD students. The presentation took one hour, and there was some interest.
You can find my markdown files etc. here and the presentation itself here.
I think it is AOT, not JIT. But I would not worry too much about this; drop JIT too, and just say that the type system is designed so that generic code can be compiled specialized to concrete types.
Handling NaN correctly is not specific to Julia; it comes from the IEEE 754 floating point standard, which pretty much everyone uses.
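For the slides, the IEEE 754 behavior can be demonstrated in a few lines (a generic illustration, not taken from the presentation):

```julia
# NaN behavior is specified by IEEE 754, not by Julia.
x = 0.0 / 0.0    # an invalid operation produces NaN
x == x           # false: NaN compares unequal to everything, itself included
isnan(x)         # true: the portable way to test for NaN
NaN + 1.0        # NaN propagates through arithmetic
```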
The overhead of foreign calls is not necessarily zero; it is small in the ideal case.
Instead of bullet points, I would just work through a small example, e.g. write up a toy problem and demonstrate AD on it.
Hi,
it’s a useful (and nice-looking) presentation. However, I suggest making it a bit more practical (similar to Tamas_Papp’s 4th point). That may well speak more directly to the students.
I would also suggest adding a few more examples of what is out there in terms of stats/econometrics, and a speed comparison (of a tricky non-linear model) with R wouldn’t hurt.
Thanks for the comments! Indeed, I was thinking about giving a small example, but didn’t have one ready before the presentation. I’ll try to think of something and include it.
I would do maximum likelihood estimation of a simple (and I mean embarrassingly simple) structural model. Knowing your audience will help you select an appropriate one.
- present the model,
- make up parameters, generate data (this demos Distributions.jl, basic syntax, possibly loops, showing off UTF-8 notation, possibly plotting),
- code up the (log) likelihood,
- AD through the likelihood with ForwardDiff.jl seamlessly (magic!).
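A minimal sketch of those four steps, with made-up parameter values (the model, a normal location-scale model, is my own choice for illustration):

```julia
using Distributions, ForwardDiff

# 1. Model: yᵢ = μ + εᵢ,  εᵢ ~ Normal(0, σ)
# 2. Make up parameters and generate data
μ, σ = 1.5, 2.0
y = rand(Normal(μ, σ), 1_000)

# 3. Code up the log likelihood. Parametrize by log σ so the
#    parameter space is unconstrained; keep the code generic so
#    ForwardDiff can push its dual numbers through it.
loglik(θ) = sum(logpdf.(Normal(θ[1], exp(θ[2])), y))

# 4. AD through the likelihood: a two-element gradient, with no
#    hand-coded derivatives.
g = ForwardDiff.gradient(loglik, [0.0, 0.0])
```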
AD is automatic differentiation. ForwardDiff.jl is robust, while Zygote.jl is new and exciting: it is much faster for high-dimensional inputs, and it also addresses one of ForwardDiff’s major downsides, namely that it requires code to have been written in a generic fashion.
Eventually, we’ll have Capstan.jl too. It will probably be the default.
The last major ones that spring to mind are AutoGrad.jl, which is used by Knet.jl, and ReverseDiffSparse, which has now been integrated into JuMP.jl. There are other libraries as well.
I encourage you to give it a try!
Definitely feels like magic (to me).
What is the difference between JIT and AOT (ahead-of-time compilation)? Or maybe this is the wrong place for a discussion of this topic?
JIT is like Java, where a compiler can use runtime information to optimize “hot” code.
Julia’s compilation is more like C++, except that Julia’s compiler procrastinates until the first time you call a function with a given combination of argument types. At that point it compiles, via LLVM, a version of the function specialized for those argument types, in more or less the same way C++ would.
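A toy illustration of that specialization (the function `double` is hypothetical, just to show the mechanism):

```julia
# One generic definition, no type annotations.
double(x) = x + x

# The first call with an Int triggers compilation of an
# Int-specialized method; a Float64 argument later triggers a
# separate Float64 specialization.
double(21)     # 42
double(1.5)    # 3.0

# `@code_typed double(21)` shows that the Int version lowers to a
# single integer add, much as compiled C++ would.
```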