`log` performance regression and JET failures

Hi, I’m seeing some surprising results for `log` in 1.8.0-beta1. First, on 1.7:

julia> versioninfo()
Julia Version 1.7.2
Commit bf53498635 (2022-02-06 15:21 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-12.0.1 (ORCJIT, skylake)

julia> @btime log(x) setup=(x=rand());
  5.371 ns (0 allocations: 0 bytes)

julia> JET.@test_opt log(2.1)
Test Passed
  Expression: #= REPL[17]:1 =# JET.@test_call analyzer = JET.OptAnalyzer log(2.1)

Then here’s 1.8:

julia> versioninfo()
Julia Version 1.8.0-beta1
Commit 7b711ce699 (2022-02-23 15:09 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: 8 × Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-13.0.1 (ORCJIT, skylake)
  Threads: 1 on 8 virtual cores

julia> @btime log(x) setup=(x=rand());
  6.548 ns (0 allocations: 0 bytes)

julia> @test_opt log(2.1)
JET-test failed at REPL[16]:1
  Expression: #= REPL[16]:1 =# JET.@test_call analyzer = JET.OptAnalyzer log(2.1)
  ═════ 1 possible error found ═════
  ┌ @ special/log.jl:257 Base.Math.Val(:ℯ)
  │┌ @ essentials.jl:714 %1()
  ││ runtime dispatch detected: %1::Type{Val{_A}} where _A()
  │└─────────────────────
  
ERROR: There was an error during testing

It’s measurably slower, and it fails JET.@test_opt. Is this a real regression, or just a symptom of other changes or of things being measured differently?
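In case anyone wants to reproduce, the timing can also be done with `$`-interpolation instead of `setup` (a minimal sketch; the batched variant is just an assumption on my part to average out per-call noise):

```julia
using BenchmarkTools

# Same single-call benchmark as above, with interpolation so the
# global binding doesn't add extra overhead to the measurement:
x = rand()
@btime log($x)

# Batched variant (sketch): time many calls at once to smooth noise.
xs = rand(1_000)
@btime sum(log, $xs)
```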


It seems to be a bug in JET that @test_opt fails on this example. In particular, JET hasn’t yet been fully updated to account for the new effect system coming in 1.8 – the call to Val(:ℯ) should be constant-folded, so JET shouldn’t report any runtime dispatch within it.
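For context, the dispatch pattern JET is flagging looks roughly like this (a simplified sketch with hypothetical names, not Base’s actual code):

```julia
# Sketch of the pattern: log dispatches to a kernel parameterized
# by the base, passed as a Val singleton.
_mylog(x, ::Val{:ℯ}) = log(x)      # natural-log branch
mylog(x) = _mylog(x, Val(:ℯ))      # Val(:ℯ) should fold to a constant

# With constant propagation (and the 1.8 effect system), the compiler
# resolves Val(:ℯ) at compile time, so no runtime dispatch remains;
# JET's OptAnalyzer just doesn't model that yet here.
```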

As for the performance, I couldn’t reproduce a measurable runtime regression on my local machine.


FWIW, I filed this issue here.


Great, thanks @aviatesk for the quick response!