How to measure precompilation time?

Source of latency       Measurement method
Precompilation time     ???
Load time               @time_imports using MyPackage
Compilation time        @time MyPackage.myfunc()

I could do @time Pkg.add("MyPackage"), but that combines the download and the precompilation. The documentation on modules mentions Base.compilecache(Base.identify_package("MyPackage")). Is that the right command to time?


compilecache and those docs read as if precompilation happens during using/import, which seems different from the precompilation reported when a package is added. I’m not sure which precompilation happens when, or exactly which parts of a package are excluded from precompilation besides calls in __init__, but I’m sure there are at least two distinct times, based on this blog post on precompilation:

We’ll focus on precompilation,

julia> using SomePkg
[ Info: Precompiling SomePkg [12345678-abcd-9876-efab-1234abcd5e6f]

or the related Precompiling project... output that occurs after updating packages on Julia 1.6

It’d be nice to map out a timeline, from adding a package to the user’s first calls, of what inside a package gets precompiled, compiled, and executed, and of the ways we can measure each step.

  • close VSCode and all other programmes
  • rename the .julia folder
  • run using Pkg; @time using <mypackage>
  • restart Julia and run the same code
  • subtract the second time from the first time
  • delete the .julia folder and move back the original version

This gives you the precompilation time of mypackage and all of its dependencies…
It might include the download time; I’m not sure how to avoid that. A rough scripted variant of the subtraction idea is sketched just below.
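
A rough, scriptable variant of the subtraction idea (just a sketch: MyPackage is a placeholder name and is assumed to be already installed, with its precompile cache cleared beforehand):

# First run pays precompilation + load (+ Julia startup); second run pays load only.
pkg = "MyPackage"                     # placeholder
cmd = `$(Base.julia_cmd()) --startup-file=no -e "using $pkg"`
t1 = @elapsed run(cmd)                # precompile + load
t2 = @elapsed run(cmd)                # load from the warm cache
println("approximate precompilation time: ", t1 - t2, " s")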

Is that what you want to know?

Is it not Pkg.precompile(p)?

I think those two are the same kind of precompilation; what makes you say they’re different? To me the difference is just the set of packages that gets precompiled (the one you’ve just added vs. the ones needed by the project that haven’t been precompiled yet or have changed since).
https://pkgdocs.julialang.org/v1/environments/#Environment-Precompilation

I explored this option, which I called the nuclear option. Looking for something slightly less disruptive ^^

Indeed, that looks promising, and I can even deactivate automatic precompilation after the download:
https://pkgdocs.julialang.org/v1/api/#Pkg.precompile
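
For instance (a sketch; MyPackage is a placeholder name): setting JULIA_PKG_PRECOMPILE_AUTO=0 turns off the automatic precompilation that normally runs after Pkg.add, so Pkg.precompile() can then be timed on its own:

ENV["JULIA_PKG_PRECOMPILE_AUTO"] = "0"   # disable the automatic "Precompiling project..." step
import Pkg
Pkg.add("MyPackage")                     # download + install only
@time Pkg.precompile()                   # time precompilation of the environment separately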

To measure precompilation or the total TTFX that includes precompilation, a scriptable approach is to create a temp DEPOT directory and pass it to Julia.
Basically,

code = """
import Pkg
# calls to precompile etc
...
"""
projecttoml = """
[deps]
...
"""
mktempdir() do depot
    # if needed - copy registries:
    # mkdir(joinpath(depot, "registries"))
    # cp(expanduser("~/.julia/registries/General"), joinpath(depot, "registries/General"))
    # ...
    mktempdir() do env
        cd(env)
        write("Project.toml", projecttoml)
        write("script.jl", code)
        run(addenv(`julia --project script.jl`, Dict("JULIA_DEPOT_PATH" => depot)))
    end
end

See a full script in a gist of mine: env_benchmark.jl · GitHub. That specific script measures how long install + precompile + use would take if you created a new env with the same packages every 10 days (all updated deps will precompile).
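
For illustration only, the placeholders in the snippet above could be filled in roughly like this (Example.jl is just a stand-in; put the packages you actually want to measure under [deps]):

code = """
import Pkg
Pkg.instantiate()                 # download/install the deps listed in Project.toml
t = @elapsed Pkg.precompile()     # time precompilation of the whole environment
println("precompile time: ", t, " s")
"""
projecttoml = """
[deps]
Example = "7876af07-990d-54b4-ab0e-23690620f79a"
"""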


Have you considered the following?

@time Base.compilecache(Base.identify_package("MyPackage"))

This is also compatible with BenchmarkTools.jl. Here is a demo with MortgageCalculators.jl.

julia> using BenchmarkTools

julia> @benchmark Base.compilecache(Base.identify_package("MortgageCalculators"))
[ Info: Precompiling MortgageCalculators [cbda39b7-a5b8-49f7-bf50-e22af4f2d7a9]
[ Info: Precompiling MortgageCalculators [cbda39b7-a5b8-49f7-bf50-e22af4f2d7a9]
[ Info: Precompiling MortgageCalculators [cbda39b7-a5b8-49f7-bf50-e22af4f2d7a9]
[ Info: Precompiling MortgageCalculators [cbda39b7-a5b8-49f7-bf50-e22af4f2d7a9]
[ Info: Precompiling MortgageCalculators [cbda39b7-a5b8-49f7-bf50-e22af4f2d7a9]
[ Info: Precompiling MortgageCalculators [cbda39b7-a5b8-49f7-bf50-e22af4f2d7a9]
[ Info: Precompiling MortgageCalculators [cbda39b7-a5b8-49f7-bf50-e22af4f2d7a9]
BenchmarkTools.Trial: 2 samples with 1 evaluation.
 Range (min … max):  3.557 s …   3.676 s  ┊ GC (min … max): 0.00% … 0.00%
 Time  (median):     3.616 s              ┊ GC (median):    0.00%
 Time  (mean ± σ):   3.616 s ± 83.759 ms  ┊ GC (mean ± σ):  0.00% ± 0.00%

  █                                                       █  
  █▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁█ ▁
  3.56 s         Histogram: frequency by time        3.68 s <

 Memory estimate: 311.12 KiB, allocs estimate: 3381.

Edit after reading more of the thread: The above measures the precompilation of the specific package, not its dependencies. It also invalidates any pre-existing cache that may exist.

If you want to measure the total precompilation time, you may need to create a temporary Julia depot by setting ENV["JULIA_DEPOT_PATH"] or manipulating DEPOT_PATH. Even then, you may want to erase the path at joinpath(DEPOT_PATH[1], "compiled") before running Pkg.precompile().
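
Something along these lines (a sketch; note that wiping the compiled directory forces every package in that depot to re-precompile):

import Pkg
compiled = joinpath(DEPOT_PATH[1], "compiled")
isdir(compiled) && rm(compiled; recursive=true)   # drop all existing precompile caches
@time Pkg.precompile()                            # precompile the active environment and its deps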


Does it assume that the dependencies are also precompiled? If not, do they get JIT-compiled and then discarded?

Very cool that it works with BenchmarkTools, is there any way to measure load time in the same way? As in, without restarting the session?

The dependencies will only get re-precompiled if there was some event to invalidate the compile cache. This is why you might want a temporary depot.

My guess is there may be a route via Base.require but looking at the implementation of Revise.jl may give another hint.

The other thought I have is just running another Julia process; you can use Base.julia_cmd(), for example.
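
For example (a sketch; MyPackage is a placeholder name), you could time the load in a fresh child process so the current session stays warm:

run(`$(Base.julia_cmd()) --startup-file=no -e '@time using MyPackage'`)   # prints the load time measured in the child process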


What I mean is, I wondered whether the timing in this scenario includes precompilation of dependencies, and I guess that depends on the state of the depot.