For classic ML and DNNs, scikit-learn, TensorFlow, and PyTorch are popular and will remain so for the foreseeable future. For genetic algorithms, GNNs, and LLMs as well, the Python plus C/C++ ecosystem is at the forefront. I would like to understand which domains of AI and other digital technologies Julia is leading in, or trying to emerge as the leader of. The reason for the question is to figure out what to use Julia for.
As usual, this is relative. We personally do use the deep learning Julia packages extensively (instead of torch), and that has to do with integration considerations.
I’m a huge fan of the packages related to numerical simulation of differential equations. They’re the main reason that made us switch most of our code to Julia (previously we were using Python for this stuff).
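To give a flavor of that workflow, here is a minimal sketch (assuming DifferentialEquations.jl is installed; the logistic ODE here is just an illustrative example, not from the post above):

```julia
using DifferentialEquations

# Logistic growth: du/dt = u * (1 - u), starting at u(0) = 0.1 over t ∈ [0, 10].
f(u, p, t) = u * (1 - u)
prob = ODEProblem(f, 0.1, (0.0, 10.0))
sol = solve(prob, Tsit5())   # Tsit5: a standard non-stiff Runge-Kutta solver

println(sol(10.0))           # the solution approaches the fixed point u = 1
```

The same `ODEProblem`/`solve` interface extends to stiff solvers, SDEs, DAEs, and more, which is a large part of the ecosystem's appeal.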
I also like the auto-differentiation experience despite the initial hurdles one has to go through to learn how to use some of the related packages. This made me able to solve a lot of seemingly brutal (analytically) optimization problems with great computational performance.
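For illustration, a tiny sketch with ForwardDiff.jl (one of several AD packages in the ecosystem; Zygote.jl is a popular reverse-mode alternative), taking the gradient of the Rosenbrock function at its known minimum:

```julia
using ForwardDiff

# Rosenbrock function: global minimum at (1, 1) where the gradient vanishes.
rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

g = ForwardDiff.gradient(rosenbrock, [1.0, 1.0])
println(g)   # ≈ [0.0, 0.0]
```

Gradients like this can then be fed straight into an optimizer, which is the pattern behind solving "analytically brutal" problems numerically.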
Nobody can say for certain whether, or in which domains, Julia might lead one day. Looking at the past, one could state that Julia has a strong standing in the scientific computing community and continues to gain traction in this domain.
To get a more quantitative view of Julia's use cases, you might be interested in last year's survey; it could help you gauge the potential of Julia for your specific projects. From page 29 onwards, you can see what Julia is used for: 2023-julia-user-developer-survey.pdf (julialang.org)
Some other good resources here include:
“should” is an interesting word here - I’m programming microcontrollers (or at least trying to) with Julia. Should I? I don’t know, that’s what I’m trying to figure out!
So to me, there is no “should” - I just try, and if it works, great, if it doesn’t, I try to learn why.
For the question of “What does Julia do better than the rest?”, I often talk about SciML.
I would also say that “Julia” is not trying to do anything. The Julia developers are trying to make the best language they can according to what they and the users want/need. Beyond that, it’s just people discovering Julia, enjoying the language, and using it in their domain of expertise.
I don’t think it needs to be any more than that.
This question — especially with the normative “should” — is definitely one of those sorts of threads that could spur numerous hot-takes and endless discussion. I’m going to slightly adjust the title to the descriptive “What is Julia used for?” and set a close timer.
JuMP puts Julia at the top for mathematical optimization; it is heavily used in the energy industry: https://jump.dev/
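As a small sketch of what JuMP looks like (assuming the HiGHS solver package is installed; any LP-capable solver supported by JuMP would work, and the toy problem here is made up):

```julia
using JuMP, HiGHS

# A tiny linear program: maximize 3x + 2y subject to bounds and x + y ≤ 4.
model = Model(HiGHS.Optimizer)
set_silent(model)
@variable(model, 0 <= x <= 2)
@variable(model, 0 <= y <= 3)
@constraint(model, x + y <= 4)
@objective(model, Max, 3x + 2y)
optimize!(model)

println(objective_value(model))   # x = 2, y = 2 gives 10.0
```

The same modeling syntax scales from toy LPs to the large mixed-integer and nonlinear problems common in energy-systems work; swapping solvers is a one-line change.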
If you take a look at the frameworks you mentioned, they are composed of components in many distinct languages. For example, below is the language composition of SciPy, which serves as the base for scikit-learn.
TensorFlow and PyTorch are each about half C++ and half Python. This can create challenging hurdles for development and maintenance.
If you take a look at Flux.jl, for example, you will see it is written entirely in Julia. This is true of many Julia packages.
I use Julia when it makes sense to have everything in one coherent language rather than many. In my work, this occurs when I'm doing research or scientific computing, that is, when I'm trying to do tasks that have not clearly been done before. That may include classic ML, DNNs, GNNs, or LLMs.
I’m one of the developers of the data visualization framework Makie.jl, and for such a package, Julia’s mix of dynamic behavior plus compilation to fast code is great. Just an example: in CairoMakie, when we want to draw a scatter plot, we first convert the color information we get from the user to a common format, usually a vector of RGBA values (Makie.jl/CairoMakie/src/primitives.jl at master · MakieOrg/Makie.jl · GitHub). For this you need dynamic code, because these colors could really be anything: `RGBf(1, 0, 0)`, any `AbstractVector` of any colors, etc. That makes it very flexible for the user. But after conversion, when we have a nice RGBA vector, we can broadcast over everything in a compiled loop (Makie.jl/CairoMakie/src/primitives.jl at master · MakieOrg/Makie.jl · GitHub) and call Cairo’s C routines with very low overhead.
We can move freely along this dynamic/static continuum: wherever we need more performance, we move further toward static; wherever we need more freedom, or know less about what the user might do, we make things more dynamic.
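This dynamic-to-static handoff is sometimes called a function barrier: loosely typed code normalizes user input, then hands a concretely typed array to an inner loop that compiles to fast machine code. A hypothetical sketch of the idea using Colors.jl (`to_rgba` and `total_alpha` are made-up names, not Makie's actual code):

```julia
using Colors

# Dynamic part: accept "anything" color-like and normalize it
# to a concrete Vector{RGBA{Float32}}.
to_rgba(c::Colorant) = [RGBA{Float32}(c)]
to_rgba(v::AbstractVector) = RGBA{Float32}.(v)

# Static part: with concrete element types, this loop compiles
# to a tight, allocation-free iteration.
function total_alpha(colors::Vector{RGBA{Float32}})
    s = 0.0f0
    for c in colors
        s += c.alpha
    end
    return s
end

colors = to_rgba([RGB(1, 0, 0), RGBA(0, 1, 0, 0.5)])
println(total_alpha(colors))   # 1.0 + 0.5 = 1.5
```

The flexible method handles arbitrary user input, while downstream code only ever sees one concrete type.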
The only annoying thing is the latency, but at least for users this has gotten much better with package images. Not so much for us developers, of course.
For making programming more enjoyable.
Python became so popular for ML and other scientific applications because, back in the 2000s, Google started to use it for data analysis, and this created a network effect. Everybody wants to use the same tools as Google, although Python is clearly not the best tool for the scientific domain and was not created with performance in mind.
There is nuance here: those graphs don’t account for dependencies or for base languages being at least partially written in other languages, nor do they reflect how responsible each language is for runtime performance. For example, Stable Diffusion is 100% Python but depends on PyTorch, which isn’t, and the heavy lifting definitely isn’t written in pure Python. Regardless, it is true that a codebase in one language is a perk, and that comes easier and smoother in Julia because performant internals can be written in Julia itself.
Thank you all for your comments. It has been helpful and I have made notes.
Also JAX: Julia has native solutions competing with JAX (and PyTorch). At one point the TensorFlow.jl Julia API was maintained, but people now use Flux, Lux, or others. People do classic ML, e.g. clustering, in native Julia, but also some with a scikit-learn wrapper.
Julia is far ahead with SciML, scientific machine learning (one of many shining examples of Julia, with open-source code built on it as well as proprietary apps and companies, e.g. PumasAI), but it’s likely fair to say it is playing catch-up in some other areas, e.g. LLMs. But you can reuse all such tools as is.
Zipline, a drone-tech company delivering medical supplies and blood, is one of my favorite examples from here:
See Julia case studies on it and others at juliahub.com
JAX is nice, and it’s a shame Julia / Flux.jl did not get the corporate support required; otherwise we could be talking about a very different scenario here, with Julia being a major contender.
To add to the discussion, I would say Julia is quite popular on the Bayesian inference side of things: message passing, variational inference, probabilistic programming, etc.
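For a flavor of that, here is a minimal probabilistic-programming sketch with Turing.jl (assuming it is installed; the model and data are made up for illustration):

```julia
using Turing

# A toy model: unknown mean μ with a wide prior, unit-variance observations.
@model function demo(x)
    μ ~ Normal(0, 10)
    for i in eachindex(x)
        x[i] ~ Normal(μ, 1)
    end
end

# Draw posterior samples with the NUTS sampler.
chain = sample(demo([2.1, 1.9, 2.0, 2.2]), NUTS(), 1000; progress=false)
println(mean(chain[:μ]))   # posterior mean should land near 2.0
```

Because the model is just Julia code, you can swap samplers, use variational inference, or plug in your own distributions without leaving the language.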
This topic was automatically closed after 2 days. New replies are no longer allowed.