At present, in what aspects is Julia still relatively weak compared to other mainstream programming languages?

Posted a similar comment on HN recently.

From my point of view, there is still too much churn around automatic differentiation libraries.

There is Zygote.jl, which is used in Flux.jl, but it is largely in maintenance mode. At some point Diffractor.jl was hyped, but it hasn't taken off yet. And now there is Enzyme.jl, which people are hyping instead.
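To illustrate, even the basic entry points differ between the two; here is a minimal sketch with a toy function (illustrative only, and Enzyme's API in particular has shifted between versions):

```julia
using Zygote
using Enzyme

f(x) = 3x^2 + 2x

# Zygote: source-to-source reverse mode on Julia IR
Zygote.gradient(f, 5.0)           # (32.0,)

# Enzyme: LLVM-level AD; the entry point and return shape
# have changed across recent versions
Enzyme.gradient(Reverse, f, 5.0)
```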

But for me, as a user, it's not clear what I should actually do to make my code differentiate well with those libraries. I was hoping that AbstractDifferentiation.jl could fix that, but my take-away from that recent discussion was that it has its own issues.
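The promise of AbstractDifferentiation.jl is that user code targets one API and the backend can be swapped out underneath; roughly this (a sketch assuming the Zygote backend, hedging that the exact API may have moved since):

```julia
import AbstractDifferentiation as AD
using Zygote

loss(x) = sum(abs2, x)

backend = AD.ZygoteBackend()

# returns a one-tuple, one gradient per argument
AD.gradient(backend, loss, rand(3))
```

In principle, swapping `backend` for another AD package's backend should leave the rest of the code unchanged; in practice that is exactly where the rough edges show up.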

If you stick with PyTorch, JAX, or TensorFlow, everything seems to just work when it comes to AD.

This is the same pattern I keep observing among people in my community (differentiable forward models in imaging and physics). We work with large 2D or 3D arrays (millions to billions of entries) and optimize over them, so it has to run on CUDA, and AD is the core feature everyone needs; see the sketch below.
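To make the use case concrete, the core pattern looks roughly like this (a minimal sketch, assuming a CUDA-capable GPU; `forward` here is a made-up stand-in for a real imaging or physics operator such as a convolution or propagation kernel):

```julia
using CUDA, Zygote

# stand-in forward model; a real one would be a physical operator
forward(x) = sqrt.(abs.(x) .+ 1f0)

y = CUDA.rand(Float32, 2048, 2048)  # measured data
x = CUDA.rand(Float32, 2048, 2048)  # current estimate

loss(x) = sum(abs2, forward(x) .- y)

# gradient w.r.t. the full array; this is the operation that has to
# run reliably on the GPU for the whole workflow to be viable
g, = Zygote.gradient(loss, x)
```

When this works, it works nicely; the problem is that whether it works depends on the AD library, the operations inside `forward`, and the GPU support for each.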
