I have a use case for precompile(), which I believe deviates a bit from the original intent and implementation of the function (caching of compiled code in the precompile phase for packages, as I understand it).
I’m building a realtime system which loads dynamic code (it changes from run to run). Code loading is not performance critical, but later use is. What I’d ideally want is an option to precompile() that compiles all reachable functions in the entire call graph. I haven’t looked at the implementation itself, so I’m not sure how big or small an ask this is, but it would be very helpful for the way I’m trying to use it. Currently I fall back to warming up my function with different input data to try to exercise all code paths, but this can quickly become complicated.
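To illustrate the difference (function names here are made up for the example): warmup compiles whatever the sample inputs happen to reach, while precompile() today targets one specific signature.

```julia
# Hypothetical event handler standing in for my real code.
process(ev::Int)     = ev * 2
process(ev::Float64) = ev / 2

# Current fallback: warm up with representative inputs, which compiles
# whatever code paths those particular inputs happen to exercise.
process(1)
process(1.0)

# What precompile() offers: compile one specific signature ahead of time,
# but (at least historically) not everything reachable from it.
precompile(process, (Int,))
```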
I do not have an answer for you.
But in view of multiple dispatch, I was wondering: do you know all the types of the arguments (and all combinations thereof) that will be called?
I know the types of the arguments I’m calling the function with, and in my case all types in the call chain should be inferable. However, I see your point: in the general case, and possibly in some corner cases I haven’t considered, types are not always statically inferable.
Even worse: that’s not even well-defined in a multiple dispatch language when duck typing gets involved. But if you do know a set that you want to compile on, then you can do it. This is what SnoopCompile.jl does:
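A recording session looks roughly like this (using the `@snoopi`/`parcel`/`write` workflow from SnoopCompile’s documentation; the exact API has varied across versions, and the paths are illustrative):

```julia
using SnoopCompile

# Record which method instances get inferred while running a workload.
inf_timing = @snoopi include("my_workload.jl")

# Group the results by package and emit precompile(...) statements.
pc = SnoopCompile.parcel(inf_timing)
SnoopCompile.write("/tmp/precompile", pc)
```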
I looked at SnoopCompile.jl, and from what I gather it only records which functions were compiled as a result of invocation. That’s effectively what my warmup does already, and there’s no caching component to this in my system. I only want to compile ahead of time, before realtime events start coming in. The one shortcoming of warmup that I was hoping the compiler could address (when it can statically determine types) is doing an exhaustive compilation of the call graph.
SnoopCompile creates precompilation statements that invoke Julia’s precompilation on those methods with those arguments. If what you wanted was for Julia to precompile a specific set of methods like your title said, then that does it.
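Concretely, the statements SnoopCompile emits are ordinary `precompile` calls on tuple types of the function and its concrete argument types (the function name below is illustrative):

```julia
# One emitted statement per method instance that was compiled:
# Tuple{typeof(f), ArgType1, ArgType2, ...}
precompile(Tuple{typeof(process_event), Int64, Float64})
```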
The issue you’re getting at is that precompilation is not the same as actually caching the compiled function. That is correct. I hope that changes in the future, which would make SnoopCompile a full solution here, but for now it isn’t. Instead, you would need to ahead-of-time (AOT) compile the precompile statements generated by SnoopCompile. PackageCompiler.jl does just that:
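A sketch of that step with PackageCompiler’s `create_sysimage` (keyword names per its documentation; the package name and paths are illustrative):

```julia
using PackageCompiler

# Bake the recorded precompile statements into a custom system image,
# so the compiled native code is cached across Julia sessions.
create_sysimage([:MyPackage];
                sysimage_path = "custom_sys.so",
                precompile_statements_file = "/tmp/precompile_statements.jl")
```

Starting Julia with `julia --sysimage custom_sys.so` then loads that image with the compiled code already in place.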
Thanks, I appreciate your help and input. I’ve looked at PackageCompiler as well, and SnoopCompile. I’m either missing something, or I’m not getting across what problem I’m trying to solve.
I have a system that receives code at runtime: scripts for doing event processing and performing online computations. The code for the scripts is not known until loaded, and the combination of scripts loaded (through a DSL for an event processing graph) ends up generating new, potentially different and unique code each time. The scope for using existing Julia precompilation tools, I think, is in compiling ahead of time as much as possible of the functionality these scripts bind together, but the scripts themselves, and the resulting event processing graph(s), are not precompilable with these tools.
Maybe this was clear already and I’m misunderstanding something. The most natural solution for me would be a precompile() that exhaustively compiles the full call graph, limited only by where types can be statically inferred.
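Something like the following rough sketch is what I have in mind. This is entirely hypothetical, leans on compiler internals (`:invoke` statements, `Core.MethodInstance`) that are not a stable API, and only follows calls inference resolved statically:

```julia
# Hypothetical sketch, not production code: recursively ask Julia to
# compile every callee that inference resolved statically.
function precompile_callgraph(f, argtypes::Tuple, seen = Set{Any}())
    precompile(f, argtypes)
    for (ci, _) in code_typed(f, argtypes)
        for stmt in ci.code
            # :invoke nodes mark calls whose target was resolved statically.
            (stmt isa Expr && stmt.head === :invoke) || continue
            mi = stmt.args[1]              # a Core.MethodInstance
            mi in seen && continue
            push!(seen, mi)
            ps = Tuple(mi.specTypes.parameters)  # (typeof(g), ArgTypes...)
            g  = ps[1].instance            # the callee function object
            precompile_callgraph(g, ps[2:end], seen)
        end
    end
end
```

Dynamically dispatched call sites would still be missed, which is exactly the boundary of “where types can be statically inferred” above.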
Yeah, I don’t think that exists yet. Once precompilation becomes more comprehensive, something like SnoopCompile, which generates the compilation commands you need from that script, would be the true answer, but I think you’ll have to wait.