Julia using AOT compiler?

I met a guy from London at a conference recently, and he told me that a new language called Pony might be the next C++ in the financial industry. He talked about the AOT (Ahead Of Time) compiler used by Pony, and I figure it must be something similar to a JIT compiler. I am not a computer science guy, so judging by the name, I guess it should be faster than a JIT compiler.

My question is: will Julia be able to use this kind of AOT compiler to further improve its speed? Or, more generally, as more advanced compilers come into being, will Julia be able to maintain its speed advantage?

1 Like

Please search this forum; this has been discussed in depth already. Look at the page showing Makie.jl already being successfully AOT compiled, static-julia, the old blog posts on this, the existing threads on this, etc. The short answer is: yes, Julia’s design allows this to happen, but the people who would work on it have been working on v1.0 instead.

From Pony’s web site “The language doesn’t even have the concept of null!” - I can see how that could make some things easier :joy:

2 Likes

Also, see

I am not a computer science guy, so judging by the name, I guess it should be faster than a JIT compiler.

Judging by the name is a pretty bad idea :smiley: Julia is unlike most other JIT-compiled languages. What AOT compilation can get us is really just the removal of JIT overhead, which occurs on the first call of a function.
Julia is basically statically compiled at runtime, with the same tools Clang uses to ahead-of-time compile C++ (namely LLVM) - so after the first call, the performance of that function is indistinguishable from that of an AOT compiled function.

As a demonstration in the Julia REPL:

julia> 1+1 # this is at runtime!
2
julia> test(a) = sin(a^19)
test (generic function with 1 method)
julia> @time test(22.0) # at runtime compile function for Float64 - takes quite a bit of time and memory
  0.004459 seconds (1.55 k allocations: 81.745 KiB)
0.09520449192956647

julia> @time test(22.0) # now this calls the basically AOT compiled function
  0.000005 seconds (5 allocations: 176 bytes)
0.09520449192956647

Now you might even say that test was compiled ahead of time, at least relative to the second function call :wink:

To get rid of that first slowdown, which is entirely due to compilation, one needs to tell Julia which functions to compile “ahead of time”. Getting a binary out of that, etc. is still a bit experimental, but it works even for large packages, as I have shown with Makie :slight_smile:
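
As a rough sketch of what “telling Julia which functions to compile” can look like today, the precompile function in Base requests compilation of a method for concrete argument types without actually running it (reusing the test function from above):

julia> test(a) = sin(a^19)
test (generic function with 1 method)

julia> precompile(test, (Float64,)) # compile the Float64 method before it is ever called
true

julia> test(22.0) # the first call now runs without the compilation pause
0.09520449192956647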

1 Like

Pony is a really interesting language. It targets a very different set of applications than Julia does. Pony is designed to write large, highly concurrent, distributed actor systems. Julia is designed to be both very high performance and easy to use (and thus very productive) when writing numerical code. There are some situations where these applications overlap, but I don’t think you’ll see quants or analysts using something like Pony any time soon. People who already write actor systems in C++, Java, Scala or Erlang are the ones who might be seriously interested in Pony. This recent writeup of why a company used Pony to build a system called Wallaroo is interesting and helps highlight what the language is really great at:

https://blog.wallaroolabs.com/2017/10/why-we-used-pony-to-write-wallaroo/

Carl Bolz (of PyPy fame) once quipped at a dinner we were at that “Julia has an AoT compiler that runs at runtime”, which at first I was confused by, but he explained that the way Julia works is closer to how a fully AoT compiler for C or C++ code works than how JIT compilers for languages like JavaScript or even Java work. In Julia, the compiler generates code just before execution based on runtime types and static code analysis. In “real JITs” on the other hand, the compiler is not invoked until after a significant number of executions of the code to be compiled – that’s how the system knows what the code does. Basically, Julia already generates code that’s as fast and efficient as C or C++ in very much the same way. Because of that, anything that makes C++ go faster will also make Julia go faster. Unlike C++, however, there are even more optimizations Julia could potentially borrow from the traditional JIT world: after-the-fact optimization of already-running code, specialization of code on common run-time values, etc. In short, yes, Julia is quite well positioned to take advantage of advances in either AoT or JIT compiler technologies.
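
A quick way to see this “AoT compiler that runs at runtime” behavior for yourself is to ask Julia for the code it generated for a particular call; the @code_llvm and @code_native macros print the LLVM IR and the native assembly for the method specialized on the given argument types. The add function below is just an illustrative example, and the printed output is omitted here since it depends on your CPU:

julia> add(x, y) = x + y
add (generic function with 1 method)

julia> @code_native add(1, 2) # prints a handful of machine instructions specialized for Int64 arguments

The point is that what gets printed is ordinary, fully specialized machine code, essentially what an AoT compiler for C or C++ would have produced.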

It’s also possible to fully ahead-of-time statically compile Julia code already and there’s ongoing work to make it more convenient to do so.
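
For the curious, here is a hedged sketch of what that can look like with PackageCompiler.jl, which grew out of the static-julia work mentioned above; the package list, file names and paths below are hypothetical:

julia> using PackageCompiler

julia> create_sysimage(["Makie"]; sysimage_path = "sys_makie.so",
                       precompile_execution_file = "precompile_makie.jl")

Starting Julia with julia --sysimage sys_makie.so then loads the precompiled code, and create_app goes a step further and emits a standalone application bundle.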

20 Likes