I was reading the Developer Documentation on Eval of Julia code and the 10,000-foot view of the whole process. I wondered whether it is still up to date, since there were speed-ups in 1.9.3?
I was also wondering: hypothetically, would it be possible to write LLVM in Julia?
There are new compilers popping up, like Tilde (a project still in its infancy) that is a backend for the Cuik C compiler. Odin-lang will also be using Tilde as its backend in the future. Nevertheless, replacing LLVM is hard and ambitious.
On the one hand, can the Julia language express a compiler? Yes, certainly: it's Turing complete, so it can express anything. On the other hand, does it make sense to try? Not at the moment. LLVM is doing its job, so the best place to spend effort is on making Julia able to do the things we all want, like a threaded GC, latency reduction / cached code, and compiling to a small binary for some people's uses.
Yes, but I believe it would be better to bypass LLVM (for a non-default Julia [mode]), as Zig did:
Debug compilation is 5 times faster with Zig’s x86 Backend selected by default
Using the self-hosted x86 backend also means you are not subject to the effects of upstream LLVM bugs, of which we are currently tracking over 60. In fact, the self-hosted x86 backend already passes a larger subset of our “behavior test suite” than the LLVM backend does (1984/2008 vs 1977/2008). In other words, this backend provides a more complete and correct implementation of the Zig language.
Zig used to default to LLVM, and it still does in some cases, such as on Windows.
Note that LLVM is Julia's default (huge) compiler backend. You can see lowered code with @code_lowered, then LLVM IR with @code_llvm, and in the end native code with @code_native. In many cases you wouldn't need to see or use the LLVM IR; if code is already precompiled, you get most of its benefits. But you still need to bundle the huge LLVM library with Julia (though not, if I recall correctly, when using juliac), so a smaller compiler backend could help there. Some code uses LLVM intrinsics, directly or indirectly, and a non-LLVM backend would still miss out on that.
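For reference, you can inspect each of those compilation stages from the REPL; a minimal sketch with a trivial function of my own choosing:

```julia
# A trivial function to inspect through the compilation pipeline.
f(x) = 2x + 1

@code_lowered f(1)   # lowered (desugared) Julia IR
@code_llvm f(1)      # LLVM IR generated for f(::Int)
@code_native f(1)    # native machine code for the host CPU
```

The same information is available as functions (`code_lowered`, `code_llvm`, `code_native`) if you want to capture the output programmatically rather than print it.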