Potential performance regressions in Julia 1.8 for special un-precompiled type dispatches and how to fix them

Regarding the output format for the invalidation reports, we've added to ReportMetrics.jl a fairly generic way of creating tables (e.g., below) that are similar to the plots made (e.g., here) but with more information (file / line number / method names). The script looks like:

using SnoopCompileCore
invalidations = @snoopr begin
    # load packages & do representative work
end
import ReportMetrics
ReportMetrics.report_invalidations(;
    invalidations,
    job_name = "invalidations",
    process_filename = x -> last(split(x, "packages/")),
)

And the output table looks like, for example:

│ <file name>:<line number>                           │    Method Name    │ Invalidations │ Invalidations % │
│                                                     │                   │    Number     │     (xᵢ/∑x)     │
│ ChainRulesCore/oBjCg/src/tangent_types/thunks.jl:29 │ ChainRulesCore.== │      179      │       63        │
│ ChainRulesCore/oBjCg/src/tangent_types/thunks.jl:28 │ ChainRulesCore.== │      104      │       36        │
│ ChainRulesCore/oBjCg/src/tangent_arithmetic.jl:105  │ ChainRulesCore.*  │       2       │        1        │

Could this somehow be incorporated into the GitHub action for invalidations?


That looks great! I think it would be very nice to incorporate it into the invalidations CI jobs.

My only question is: where should the code be added? My first thought was to make julia-invalidations depend on ReportMetrics.jl, but that package also includes dependencies for reporting allocations. Alternatively, we could extract pieces from report_invalidations (my thought was to include everything except the code that needs PrettyTables) and add it to SnoopCompile. That might be nicer, since SnoopCompile seems to be the lowest common denominator. I'll open a PR and see what @tim.holy thinks :slightly_smiling_face:
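For reference, a CI job could drive such a report with a workflow step along these lines. This is only a sketch: the workflow structure mirrors a typical julia-actions setup, `MyPackage` is a placeholder, and the call assumes the ReportMetrics.jl API shown in the script above.

```yaml
name: Invalidations
on: [pull_request]

jobs:
  invalidations:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: julia-actions/setup-julia@v1
        with:
          version: '1'
      - name: Report invalidations
        run: |
          julia --project=. -e '
            import Pkg; Pkg.instantiate()
            using SnoopCompileCore
            invalidations = @snoopr begin
                # load the package under test & do representative work
                using MyPackage   # placeholder package name
            end
            import ReportMetrics
            ReportMetrics.report_invalidations(;
                invalidations,
                job_name = "invalidations",
                process_filename = x -> last(split(x, "packages/")),
            )'
```

The report then appears in the job log; publishing it as a PR comment would be a separate step.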


Sounds good. Please let us know about the progress. I think it would be great to get a nice invalidation report in CI for many packages.


Will do. So far, I've opened Add tabulated_invalidations to tabulate inv summary by charleskawczynski · Pull Request #303 · timholy/SnoopCompile.jl · GitHub


@sloede Do you have the bandwidth to re-run the benchmarks? There has been significant work in Julia between versions 1.8.0 and 1.8.2. In a quick-and-dirty experiment, I ran the following code under three Julia versions:

julia -e 'import Pkg; Pkg.activate(temp=true); Pkg.add(["OrdinaryDiffEq", "Trixi"]); @time @eval(using OrdinaryDiffEq); @time @eval(using Trixi); @time trixi_include(joinpath(examples_dir(), "tree_3d_dgsem", "elixir_euler_taylor_green_vortex.jl"), maxiters=200)'

Then, I get the following timings (in seconds):

                       Julia 1.7.3   Julia 1.8.0   Julia 1.8.2
using OrdinaryDiffEq   ca. 4.0       ca. 6.5       ca. 6.5
using Trixi            ca. 14.5      ca. 18.5      ca. 16.5
trixi_include(...)     ca. 50        ca. 57        ca. 43

If you can reproduce this trend with more rigorous measurements (several repetitions, etc.), that should indicate that we are on the right track.
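A minimal repetition harness for such measurements might look like the sketch below. `CMD` is a placeholder for the julia one-liner above; the script simply times `N` runs and reports the mean (GNU `date` with `%N` is assumed, i.e. Linux coreutils).

```shell
#!/usr/bin/env bash
# Repeat a benchmark command N times, printing per-run wall-clock times
# and the mean. CMD is a placeholder: substitute the real julia invocation.
set -euo pipefail

N="${N:-3}"
CMD="${CMD:-sleep 0}"   # placeholder command; replace with the benchmark

total=0
for i in $(seq 1 "$N"); do
  start=$(date +%s.%N)            # GNU date; %N requires Linux coreutils
  eval "$CMD" > /dev/null
  end=$(date +%s.%N)
  elapsed=$(awk -v s="$start" -v e="$end" 'BEGIN { printf "%.3f", e - s }')
  echo "run $i: ${elapsed}s"
  total=$(awk -v t="$total" -v x="$elapsed" 'BEGIN { printf "%.3f", t + x }')
done
mean=$(awk -v t="$total" -v n="$N" 'BEGIN { printf "%.3f", t / n }')
echo "mean over $N runs: ${mean}s"
```

Note that first runs pay compilation/cache costs, so discarding a warm-up run (or reporting all runs, as here) is advisable.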