Is it possible to save the compiled version of a function, f, which is not in a package, using PackageCompiler? For example, suppose I have a file model.jl which contains:
```julia
module Model
include("func.jl")
end

f(rand(2,2))
```
where func.jl contains the Julia code for a function f. I would really like to be able to save the compiled version of f for reuse in some downstream application. Is this possible?
Longer story of where this comes from: I am using FastDifferentiation to generate the Jacobian of a large system of equations. This Jacobian takes a few hours to calculate, and the resulting function takes another few hours to compile when you first call it. I would like to be able to reuse this compiled function across many independent Julia processes. Right now I use FastDiff to generate an expression for this function, which I then save to a file (hence the file func.jl in the example). This works great in the REPL, but I have so far been unable to save the compiled function.
I’m the author of FastDifferentiation.jl. My experience has been that computing the symbolic derivative is generally pretty fast, even for fairly big functions.
But compiling the code to an executable can get really slow when the symbolic derivative transforms into a Julia function that is a hundred thousand lines of code or more. The LLVM compiler takes a long time on programs that big.
If it’s the compilation stage that’s slow, that’s not something I can fix. But it is possible there is a bug in FastDiff that is slowing things down. If you can share your code, I’ll look at it to see if it can be sped up.
@gdalle the construction and compilation of the Jacobian take a long time, but evaluation is then super quick (<< 1 second), so FastDiff is actually a very good solution for my use case once you have paid the derivative-creation price.
@brianguenter unfortunately the function I make is fairly non-trivial to construct; it is built from an optimization model. If you want, I can export the FastDiff function that gets differentiated, but I do not think the problem lies with FastDiff.
I have been using it for smaller problems, and it works great. As you mention, the problem lies with the size of the resulting Jacobian. At the end of the day, my Jacobian has size 10,000 x 10,000 but is fortunately sparse, with only about 2.5 million entries. The expression returned by FastDiff is about 2.6 million lines of code once I have saved it to a file. Compiling this takes a long time, and ideally I would only like to do it once.
I think the only real solution here is to save the compiled function, but I am struggling to figure out how to do that with PackageCompiler.
If FastDifferentiation.jl works for you, that’s great. For my own curiosity though, have you tried sparse autodiff with DifferentiationInterface.jl + ForwardDiff.jl?
How long does one evaluation of your function take?
Technically yes, but not in a way you’d like. The only non-package option is to save compiled code in a system image (sysimage), but a sysimage is loaded for the overall Julia process along with the basic things like the compiler and some Base function calls. That sysimage takes longer to build because of all those other things, and there’s quite a bit of manual rebuilding whenever something changes (like upgrading Julia).
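For completeness, a sysimage build with PackageCompiler might look roughly like this (a sketch; `warmup.jl` is a hypothetical script that calls `Model.f` once so its compiled code lands in the image):

```julia
using PackageCompiler

# Build a sysimage containing the Model package plus any methods
# exercised by the warmup script (which should call Model.f once).
create_sysimage(
    ["Model"];
    sysimage_path = "model_sysimage.so",
    precompile_execution_file = "warmup.jl",
)
```

Each downstream process would then start with `julia --sysimage=model_sysimage.so`, and the image has to be rebuilt by hand after any Julia upgrade.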
That’s why most precompilation is instead based on a more minimal unit, the package image. The package’s module makes a discrete namespace so names have a consistent meaning, and the package’s project file specifies acceptable versions of dependencies. You don’t even have to use PackageCompiler for its apps and shared libraries: packages already precompile by default and can be imported. As a first step, try to make a package with the structure:
```julia
module Model
include("func.jl")
precompile(f, (Matrix{Float64},)) # call signature of f(rand(2,2))
end
```
Following the rest of the linked instructions to activate that package’s environment and then import the function should report whether precompilation is going on. Confirm that the precompilation was cached by closing the session and repeating the activation and import in a fresh session. Note that any change to dependencies, including Julia itself, requires re-precompilation, but that should be detected and attempted automatically, unlike for sysimages.
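As a sketch of that first step (assuming the package lives in a local directory `Model/`):

```julia
using Pkg

Pkg.activate("Model")   # activate the package's own environment
Pkg.instantiate()       # resolve dependencies; triggers precompilation if needed

using Model             # second fresh session should hit the cached package image
Model.f(rand(2, 2))     # runs without recompiling f if the cache was reused
```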
If that first step works, you can move on to learning how to add that package to other environments.
@gdalle I have not actually tried ForwardDiff yet; I reasoned that since the function we are differentiating is with respect to a large vector (about 10,000 variables), it would not be a good choice. I can try it out, but at the moment the package I am using to construct the function that gets differentiated is sort of married to FastDiff. It will take me a little while to try out the ForwardDiff approach.
That reasoning would work if you needed a gradient, but since you’re computing a square Jacobian, sparse forward-mode algorithmic differentiation becomes a solid candidate. The main exception would be if some of the rows in the Jacobian were dense or almost dense.
Again, if it ain’t broke, don’t fix it, and FastDifferentiation seems to do a very good job. But since your preconceptions about ForwardDiff seemed incomplete, I’d be interested to see how well it works. Another reason I suggested it is that code compatible with FastDifferentiation is more likely to accept dual numbers and hence work with ForwardDiff.
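For reference, a sparse-Jacobian setup with DifferentiationInterface.jl might look roughly like this (a sketch; the toy function `g` is a stand-in for the real 10,000-variable system):

```julia
using DifferentiationInterface
import ForwardDiff
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm

# Sparse forward-mode backend: detect the sparsity pattern once,
# color the columns, then batch dual-number sweeps over the colors.
backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector = TracerSparsityDetector(),
    coloring_algorithm = GreedyColoringAlgorithm(),
)

g(x) = x .^ 2 .+ circshift(x, 1)  # toy sparse vector-to-vector function

x = rand(10)
prep = prepare_jacobian(g, backend, x)  # pay the pattern/coloring cost once
J = jacobian(g, prep, backend, x)       # sparse Jacobian, reusing prep
```

The preparation step is the expensive one; reusing `prep` across evaluations is what makes this approach competitive when the Jacobian is sparse.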