Julia LLVM projects?

I just finished up an LLVM contract porting to a new processor architecture, I’ve got a bit of free time and was wondering where I might find a list of LLVM-related items/projects that need attention in Julia. I looked at the github projects, but didn’t see anything in that category. I’d primarily be interested in backend as opposed to frontend projects mainly because that’s where my experience is, but I’d be open to either.



Most of the LLVM-related work happens in Julia itself; e.g. you can search the Julia issue tracker for LLVM-related topics.

The main LLVM-related packages are probably Cxx (https://github.com/Keno/Cxx.jl), Gallium (https://github.com/Keno/Gallium.jl), and maybe some of the GPU packages?


People frequently express interest in generating webassembly via LLVM. Perhaps that’s another possible project for someone with existing LLVM expertise?


There are also more LLVM-specific issues that affect Julia. Those require less knowledge of Julia itself, so it might be easier for you to get started there. Unfortunately, they aren't always searchable on our issue tracker (we don't currently always keep an issue open for upstream bugs).

Here’s a list of LLVM backend bugs that affect us.

X86 partial register stalls:
(in general)
AVX only (doesn't happen with SSE; there doesn't seem to be an LLVM bug report for this)

Code generation bug on X86:
We have to work around it. There are better workarounds that we might want to implement regardless, but it would be nice if we didn't need to worry about this…

Performance issues that we have to work around on non-X86

Yet another limitation on ARM and PPC64


Also, note that I just listed all the LLVM bugs affecting us that I know of and realized that almost all of them are backend issues… That's probably not surprising, since we mostly deal with LLVM IR, which is much easier to work with (we don't expose machine IR in Julia and we don't control machine IR passes at all).

Should some of these be added to the GSoC list? I think we should start readying it for another summer. (Or maybe some of these are too difficult? I don’t know this stuff so I’ll defer to your judgement)

There is also LLVM.jl (created to support CUDAnative.jl), which wraps the C API in an idiomatic, "julian" way. It's pretty incomplete, only serving my own needs for now. Not really LLVM development, but I figured I'd mention it anyway.


Not sure if it’s relevant here, but there is also a version conflict when multiple versions of LLVM get loaded dynamically.

I notice the first one was reported in LLVM 3.7. What version of LLVM will Julia use for 0.6? I ask in case this has been fixed in a later version - I think 4.0 is soon to be released.

There’s a webassembly target in the current master branch of LLVM: https://github.com/llvm-mirror/llvm/tree/master/lib/Target/WebAssembly

I’d guess this is slated for the LLVM 4.0 release.

What would the usage look like in the Julia context? A command-line argument to julia (e.g. julia -emit-webassembly)?



Someone seems to have recently started working on it. Not sure if that work covers the AVX-only case, though.

And now, as of a few days ago, WebAssembly has reached "WebAssembly Consensus" status.

Major supporters say:

  • the binary format version is frozen at 0x1
  • no more features (or any new ones will be backward compatible)
  • clean up the docs and the W3C WG charter
  • no longer a preview version: deploy, deploy, deploy


Indeed, there is already a WebAssembly codegen backend in LLVM.

I personally would be very pleased to drop JavaScript ASAP in favor of Julia on the web client side :slight_smile:


The way to approach this is to port Julia’s dependencies to WebAssembly – or replace them somehow. Getting LLVM to compile itself to wasm is the very first step. This may already work; if not it’s a very natural next move for the wasm backend. Then one needs to figure out what to do about dependencies like BLAS which have native assembly code in them. Once all dependencies are ported, replaced or removed, compiling Julia itself should be straightforward.


If linear algebra is moved out into "default packages", couldn't a wasm version just be built without the extra dependencies and use the linear algebra fallbacks? It would drop a little bit of speed but would be usable. That might be the easiest way to get this done.


Sure, those are possible solutions: writing BLAS routines in Julia is an option; moving linear algebra stuff out of Base Julia is also an option. That’s why I said “figure out what to do about” rather than “port OpenBLAS” – since porting it seems like the hardest path.


I spent some time over the last few weeks trying to understand the structure of the codebase. I believe it could help a lot to harden its modularity in this kind of situation.

Here is a draft of the remap I have sketched:

./src                                   <- julialang/julia-runtime
./doc/devdocs                           <- julialang/julia-runtime-doc
./base                                  <- julialang/julia-stdlib
./doc/stdlib                            <- julialang/julia-stdlib-doc
./test                                  <- julialang/julia-stdlib-test ?
./doc                                   <- julialang/julia-doc
./src/support                           <- julialang/julia-libcshim
./src/flisp                             <- julialang/femtolisp
./deps                                  <- julialang/3rd
./contrib                               <- julialang/sandbox
./examples                              <- julialang/examples
./etc                                   <- ?
./ui                                    <- ?

This helps to spot well-performing vs. weak assemblies. A lot of good parts appear there. Three weaknesses, IMHO: the lack of femtolisp tests, runtime tests, and femtolisp docs.

We don't absolutely need to move to git submodules yet; all of this kind of work could be simulated with proper subdirectory discipline. The next step would be to reflect the change in the Makefile.
We could then more easily decide where to plug BLAS in.

I’m still a newbie with the julia codebase and my viewpoint could be a rough approximation of reality for now.

A hot project for Julia/LLVM would be to get ahead-of-time (AOT) compiled TensorFlow XLA code running. This came out with version 1.0 of TensorFlow (a few weeks ago) and, according to the description, allows compiling computation graphs into standalone code using LLVM. This opens up the possibility of integrating TF more tightly with Julia.

Apparently TF is slowly turning into a Julia clone ;), but TF does have a h-u-g-e dev base, which would let Julia ride along on the deep learning hype.

I just came across this tweet by @viralbshah

I look forward to #ethereum contracts in #julialang then! https://t.co/J916x8xX0G

— Viral B. Shah (@Viral_B_Shah) July 29, 2016

It would be very cool to have a kind of “ETHnative” package to write Ethereum smart contracts in Julia.

I am no expert, but I believe the inspection and meta-programming capabilities of Julia would be very useful in that context, much like the CUDAnative package for CUDA kernels.

From my limited understanding of Ethereum, I would guess that, unfortunately, standard Julia code will generally compile to native code too large to be useful there. That would mean that, as with CUDAnative, only a subset of Julia might be practical.

Yes, and I believe that would be fine.