Hey folks,
I’m working on improving time to first X in Term.jl through precompilation, but I’m not very familiar with these tools, so I was hoping someone might give me some advice. For reference, I’ve read some of the blog posts out there and the docs for SnoopCompile.jl.
I’m using SnoopPrecompile.jl to precompile code for a high-level constructor, Panel.
If I run:
using Term
@time Panel(txt);
without any precompilation:
2.044457 seconds (6.98 M allocations: 365.009 MiB, 14.36% gc time, 99.91% compilation time)
with precompilation:
1.017123 seconds (1.02 M allocations: 52.379 MiB, 1.67% gc time, 99.89% compilation time: 88% of which was recompilation)
Obviously an improvement, but why is so much time still being spent re-compiling a lot of stuff?
In my pre-compilation code I have:
@precompile_setup begin
    @precompile_all_calls begin
        Panel(txt);
    end
end
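For completeness, here’s a self-contained sketch of how that block sits inside the package (the string assigned to txt here is just a placeholder for the real workload input I use):

using SnoopPrecompile

@precompile_setup begin
    # Placeholder workload input; setup code runs during precompilation
    # but its own compilation is not cached.
    txt = "example panel content"
    @precompile_all_calls begin
        # Every call in this block is executed and precompiled into
        # Term's precompile cache.
        Panel(txt)
    end
end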
so I’m calling exactly the same thing that I then run again. Why is it being recompiled?
From what I can gather online, recompilation is usually related to method invalidation, but given that I’m not defining any new methods there, why would anything get invalidated? Also:
using SnoopCompileCore  # provides @snoopr
invalidations = @snoopr begin
    Panel(txt);
end
using SnoopCompile  # provides uinvalidated
println("Num. invalidated methods: ", length(uinvalidated(invalidations)))
reports no method invalidations.
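If it helps, the extra step I’d take to see which method definitions (if any) are behind the invalidations is, as far as I understand the SnoopCompile docs, this:

using SnoopCompile  # invalidation_trees lives here
# Group the raw log returned by @snoopr by the method definition that
# triggered each invalidation; an empty result means nothing was
# invalidated by running Panel(txt).
trees = invalidation_trees(invalidations)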
Looking at inference with @snoopi_deep also seems to suggest that inference is not the main issue either.
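This is roughly what I ran for the inference check; the staleinstances call and ProfileView for the flamegraph are just my reading of the SnoopCompile docs, so correct me if those aren’t the right tools here:

using SnoopCompileCore
tinf = @snoopi_deep Panel(txt)
using SnoopCompile
# Summary of inference time and the number of MethodInstances inferred:
display(tinf)
# MethodInstances whose precompiled code has gone stale (was invalidated):
display(SnoopCompile.staleinstances(tinf))
# Flamegraph of where inference time goes:
using ProfileView
ProfileView.view(flamegraph(tinf))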
So, sorry for the potentially dumb question, but can someone explain how running the same call that I’ve precompiled still triggers fresh compilation? Note also that all subsequent calls to Panel(txt) after the first do not cause any recompilation.
Thank you,
Federico