Compiler work priorities


Sounds like a great solution, actually: make Julia (literally) walk like Python (at the beginning), then run like C :slight_smile:


Very good point. It all boils down to the “(C++) template instantiation problem”. Julia is actually in a much, much better situation than C++, since it ships with a compiler and can emit code when it is needed. On the other hand, we also don’t have something like type erasure (Java), which can lead to inefficient code generation. In that light, Julia’s form of generics is indeed an interesting research question that requires a new / more sophisticated JIT.


Is the new multithreading implementation going to come with thread-safe Base/stdlib functions? The main reason I’m not using today’s multithreading is that my program would crash due to I/O calls (what I wanted to run in parallel was reading hundreds of thousands of small files).
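
For concreteness, here is a sketch of the pattern being described, assuming a list of small text files (the function and variable names are hypothetical). The `read` call inside the threaded loop is the kind of concurrent I/O that has been crash-prone:

```julia
using Base.Threads

# Read many small files in parallel; `paths` is a hypothetical input.
# The `read` inside the @threads loop is the I/O that has been fragile
# under the current multithreading support.
function read_all(paths::Vector{String})
    contents = Vector{String}(undef, length(paths))
    @threads for i in eachindex(paths)
        contents[i] = read(paths[i], String)
    end
    return contents
end
```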


Look at C++ and Swift as two examples that took very different approaches to allowing generic programming. C++ took an approach where all possible combinations are compiled in advance, so that calls to a shared library will just work on all known types. The issue there is twofold: you have to recompile the library to add to the list, and compilation time with templates is huge because the number of combinations grows combinatorially. If you haven’t run into this yourself, look up the threads discussing how simple changes to templates add hours and days to compile time.

On the flip side, Swift has great generics, but IIRC the generated code boxes everything so that recompilation isn’t necessary for each combination. This fixes compile time but introduces a runtime cost for any use of generics, and it’s not cheap. @sbromberger, the LightGraphs.jl developer, took Swift for a spin, and this was his main gripe when coming back to Julia.

That leaves us with Julia, which takes a middle approach. Like C++, it will specialize the compiled code, but it only compiles the combinations you specifically ask for. This is a very good example of how a JIT can get rid of traditional limitations. However, it does mean there can be a bit of compile time. At the same time, you can use @nospecialize on an argument to get Swift-like behavior and dramatically reduce the amount of compilation. This is quite underused, and packages like Plots.jl should be using it more liberally.
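
To illustrate the difference (function names here are made up for the example):

```julia
# Specialized (the default): Julia compiles a fresh native method for
# every concrete argument type this is called with.
plus_one(x) = x + one(x)

# Unspecialized: @nospecialize asks the compiler not to specialize on `x`,
# so one generic compilation is reused across argument types (Swift-like),
# trading some runtime speed for much less compile time.
function plus_one_generic(@nospecialize(x))
    return x + one(x)
end
```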

But these examples also suggest other strategies that could be incorporated into Julia. The obvious one is precompilation, essentially doing the C++ thing with some chosen or default set of types. Additionally, we could go the Swift route and compile less by default. There is already a flag for the REPL to make it act like @nospecialize on everything, but a subtler intermediate approach could be interesting.
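
The “chosen set of types” idea is what `precompile` directives do today; a package can request specializations ahead of first use. A minimal sketch (`total` is a hypothetical package function):

```julia
# A hypothetical package function.
total(xs) = reduce(+, xs; init=zero(eltype(xs)))

# Request compilation for a chosen set of argument types up front,
# essentially the C++ approach scoped to what the author expects.
precompile(total, (Vector{Int},))
precompile(total, (Vector{Float64},))
```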

In fact, a tracing JIT version of Julia would be very interesting. Julia functions are already made to compile very easily, so a non-compiling interpreter could just check what you’re calling and go “wow, let’s JIT this one function”. In fact, I see a good use case for all 3 versions.


Thinking of some practical steps for compile-time latency: one weird thing about the Julia v1.0 transition is that optimization levels seem almost meaningless now. While SIMD and such used to require -O3, it’s all now in -O2 and enabled by default. What is -O3 for now? How much less do -O1 or -O0 do? They reduce compile time a bit, but not much. I think a better delineation of the compilation levels is necessary before even trying tiered JITing. I get that we want all optimizations on by default to win benchmarks, but it’s really not necessary for normal usage.
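
As an aside, the level a session was started with (`julia -O0` through `-O3`) can be inspected from within Julia itself:

```julia
# The -O flag the current session was launched with; 2 is the default.
opt_level = Base.JLOptions().opt_level
```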

(Where are the optimization levels defined anyway? I have no idea where or how that’s encoded. Is that just on the LLVM side, and was the change just an LLVM 6.0 change? Those are probably questions for @Keno.)

Allowing explicit tiered optimization of packages before making it automatic/implicit could be a good first step. We already know that packages like Plots.jl should just use -O0 or not compile at all, so if there were an option to tell the compiler how to act on those function calls, then package authors could make a choice depending on use cases and applications (in a way users could override). I think that could go a very long way towards reducing the amount of compilation. (I have no idea how that would be implemented though…)
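
For what it’s worth, a per-module knob along exactly these lines exists in Julia 1.5 and later as `Base.Experimental.@optlevel`. A minimal sketch (the module and function are hypothetical):

```julia
module LatencySensitive
    # Ask the compiler to spend minimal optimization effort on this
    # module's code: roughly -O0 scoped to one package (Julia 1.5+).
    Base.Experimental.@optlevel 0

    render() = "plot drawn"
end
```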


AFAIK they’re only used in selecting LLVM passes:


I see. So the only pass enabled by -O3 vs -O2 is alias analysis?


When it comes to money: when upgrading Homebrew just now, I got a brief message about how it is maintained by volunteers, and a request to consider donating, with a link. I followed the link, and they have a section on donations in their README, including a Patreon badge. I’m already a Patreon user, so it took me almost no time to set up a recurring donation. Were I presented with something similar for Julia (either via Patreon, which I’d prefer, or just a recurring payment via PayPal, or something), I’d be happy to do the same. Maybe something to think about?


While Patreon is one option, an alternative would be bounties for PRs. This way, people can know what they’re getting by providing funding, and can help drive difficult PRs to completion. It’s also a tangible way to vote for features that you want to see implemented, like latency improvements or multithreading.


The economics don’t really work out, unfortunately. If 1% of active Julia users gave $50, we might be able to pay for one FTE. And those numbers are pretty generous for crowd-sourced donations. One significant grant or corporate contract easily dwarfs that, which is why that’s generally how these things happen.
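
To make the arithmetic explicit (the active-user count below is a made-up illustration, not a figure from this thread):

```julia
# Hypothetical illustration: assume 200,000 active users.
active_users = 200_000
donors = active_users * 0.01   # 1% of users donate
raised = donors * 50           # $50 each
# raised == 100_000, roughly the annual cost of one FTE,
# and that already assumes optimistic participation.
```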


Julia does have a way for you to donate monthly. I think this is funneled to Julia via NumFOCUS.

> The economics don’t really work out, unfortunately. If 1% of active Julia users gave $50, we might be able to pay for one FTE. And those numbers are pretty generous for crowd-sourced donations. One significant grant or corporate contract easily dwarfs that, which is why that’s generally how these things happen.

Some people can donate large $$, some can ‘donate’ time to make big PRs, and some can do neither, but would be happy to donate small $ as a way to participate in the community. Small donors can become big donors, or they can turn into big Julia proponents. I like the idea of encouraging small donations.


I don’t mean to discourage donations at all, just to explain why I don’t think this is currently a viable path to funding serious compiler work. Please do donate to Julia via NumFOCUS! It’s very useful for other, less expensive things like paying for CI and other infrastructure for the open-source project.


Just realized I had already used that mechanism to donate to Julia! xD I was just reminded of the funding discussion when I came across that Patreon badge in Homebrew’s README, which stuck out to me because I hadn’t seen code projects use Patreon before. But yeah, not really that useful a thing to bring up here, I guess :smiley:


Many open-source projects allow bounties for specific features. I think this is a very nice thing to do. It can help freelance developers “reject” a job in order to improve Julia and still earn something. I think you should at least consider it :slight_smile: I mean, it will do no harm, right?


I think the UK lists compiler programmer as one of its skill shortages. I kinda see why.


There is Bountysource, but other than a few Power-related bounties from IBM, it hasn’t had much success (and even then, I doubt that Valentin was appropriately remunerated for his efforts).


I see. However, was it announced properly? Maybe I am not that active in the community, but I have been using Julia since 2013 and this is the very first time I have heard of this bounty source… As the language gets popular, it can be used to encourage developers to contribute. Will it be a paid job? Of course not, but sometimes devs can get a reward for their work.

Moreover, the world is quite diverse. Let’s say we gather 1,000 USD for a feature. In the USA, for a young developer, this might be worth maybe one week of work. However, in developing countries, this can sometimes be worth a month.


I think bounties in the Julia community would mostly work when a company wants support for something specific that would happen anyway, but which they would prefer to make happen sooner.

Other than this, people do a lot of high-quality work for free, and since they work on things they are interested in and use, that work is likely to be maintained in the long run. IMO bounties for just doing some programming do not have the right incentive structure for a language as complex and fast-moving as Julia; if the author is not a user, any reasonably complex code would inevitably succumb to bit rot in a short timeframe.


As your comment implies, bounties are indeed very much industry-focused. With a lot of academic users, it’s tough to properly fund a bounty. I mean, what would you say to the granting agency if you put $3,000 of grant money as a bounty on some feature? Although it’s somewhat similar to just having a paid postdoc do such work, it’s not something that is accepted in our current structures. Though this might be something we want to push for as research-focused OSS devs.


There are many issues with bounties:

  • Big issues (those likely to receive bounties) are rarely closed by a single contribution. Instead they are the culmination of several pieces of work over time, often involving several contributors.
  • Changes which will affect large pieces of the project (such as compiler changes) will require input and feedback from many stakeholders: this takes time and effort on their part, and they may not appreciate being used as free labour so that someone else can claim a bounty.
  • The changes then have to be maintained: if someone claims a bounty to write a Julia debugger that then breaks in the next release, who is responsible for fixing it? Should they get paid again?
  • There is also the risk that the change may not be accepted (or even wanted), in which case everyone has just wasted their time.

Funding OSS is hard.