`code_warntype` understanding

I get

julia> model(2.2; settings=[1,1,1.0])
ERROR: UndefKeywordError: keyword argument pars not assigned

when running your code.

Edit: With the updated code I get that both of these infer to Any.

Edit 2: I think this has something to do with inference-state dependence. For example:

julia> @code_warntype M(2.2) # still Body::Any
Variables
  #self#::Core.Const(M)
  x::Float64

Body::Any
1 ─ %1 = (:pars,)::Core.Const((:pars,))
β”‚   %2 = Core.apply_type(Core.NamedTuple, %1)::Core.Const(NamedTuple{(:pars,), T} where T<:Tuple)
β”‚   %3 = Base.vect(1, 1, 1.0)::Vector{Float64}
β”‚   %4 = Core.tuple(%3)::Tuple{Vector{Float64}}
β”‚   %5 = (%2)(%4)::NamedTuple{(:pars,), Tuple{Vector{Float64}}}
β”‚   %6 = Core.kwfunc(Main.model)::Core.Const(Main.MyTest2.var"#model#3##kw"())
β”‚   %7 = (%6)(%5, Main.model, x)::Any
└──      return %7

julia> @eval Base begin
           sum(f, a; kw...) = mapreduce(f, add_sum, a; kw...)
       end
sum (generic function with 14 methods)

julia> @code_warntype M(2.2) # now Body::ComplexF64
Variables
  #self#::Core.Const(M)
  x::Float64

Body::ComplexF64
1 ─ %1 = (:pars,)::Core.Const((:pars,))
β”‚   %2 = Core.apply_type(Core.NamedTuple, %1)::Core.Const(NamedTuple{(:pars,), T} where T<:Tuple)
β”‚   %3 = Base.vect(1, 1, 1.0)::Vector{Float64}
β”‚   %4 = Core.tuple(%3)::Tuple{Vector{Float64}}
β”‚   %5 = (%2)(%4)::NamedTuple{(:pars,), Tuple{Vector{Float64}}}
β”‚   %6 = Core.kwfunc(Main.model)::Core.Const(Main.MyTest2.var"#model#3##kw"())
β”‚   %7 = (%6)(%5, Main.model, x)::ComplexF64
└──      return %7

The function I evalled is identical to the earlier one and only serves to trigger recompilation.

Before the eval we had:

  β”Œ @ reduce.jl:501 within `sum'
β”‚  β”‚ %3 = invoke Base.:(var"#sum#216")($(QuoteNode(Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}()))::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, %1::typeof(sum), %2::Function, _4::Base.Generator{Base.Iterators.Zip{Tuple{Vector{Float64}, Vector{NamedTuple{(:N, :k, :p), Tuple{Int64, Float64, Float64}}}}}, Main.MyTest2.var"#2#5"{Float64}})::Any

after, it changes to (look at the return type at the end):

β”‚  β”Œ @ REPL[9]:2 within `sum'
β”‚  β”‚ %3 = invoke Base.:(var"#sum#865")($(QuoteNode(Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}()))::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, %1::typeof(sum), %2::Function, _4::Base.Generator{Base.Iterators.Zip{Tuple{Vector{Float64}, Vector{NamedTuple{(:N, :k, :p), Tuple{Int64, Float64, Float64}}}}}, Main.MyTest2.var"#2#5"{Float64}})::ComplexF64

Also, see the allocation difference:

julia> @btime M(2.2)
  521.423 ns (13 allocations: 480 bytes)
-1.7454668760152625 + 4.251554706912252im

julia> @eval Base begin
       sum(f, a; kw...)= mapreduce(f, add_sum, a; kw...)
       end
sum (generic function with 14 methods)

julia> @btime M(2.2)
  440.561 ns (6 allocations: 256 bytes)
-1.7454668760152625 + 4.251554706912252im

Hi @kristoffer.carlsson, thanks for checking.

What does it mean?

In the example below, if I comment out the line marked with # !!!!!!!!!!, I get ::Any.

module MyTest2
    using Parameters
    function build_model(list_of_settings)
        function model(x; pars)
            return sum(p*single_term_model(x; settings=s)
                for (p,s) in zip(pars, list_of_settings))
        end
        return model
    end
    function single_term_model(x; settings)
        @unpack N, k, p = settings
        value = sum((k*x)^i*cis(i*p*x) for i in 1:N)
        return value
    end
    export build_model
end

using .MyTest2

const list_of_settings = [(N=3,k=0.1,p=0.3),
                    (N=5,k=0.5,p=0.3),
                    (N=7,k=0.2,p=0.3)]
#
#!!!!!!!!!!
# @code_warntype MyTest2.single_term_model(2.2; settings=list_of_settings[1]) # correct! Complex{Float}
#!!!!!!!!!!
# 
const model = build_model(list_of_settings)

@code_warntype model(2.2; pars=[1.0, 1.0, 1.0]) # Body::Any

As soon as I uncomment the line, I get Body::Complex{Float64}

It seems that one needs to call the internal function once to get its return type inferred.
Should I do this for every internal function…?

This is pretty much what it means: the order in which the functions were inferred changed the result. I would classify it as an inference bug. There is an issue open about it, but I can't find it right now.
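One possible workaround (my own sketch, not something suggested in the thread): annotate the closure's return type, so callers see a concrete type regardless of the order in which things were inferred. Here `f` and `build_model_typed` are hypothetical stand-ins for the thread's `single_term_model` and `build_model`, and `ComplexF64` is an assumption about the result type:

```julia
# Hypothetical stand-in for the thread's single_term_model.
f(x; k) = sum((k * x)^i * cis(i * x) for i in 1:3)

function build_model_typed(ks)
    function model(x)
        # The ::ComplexF64 assertion gives the caller a concrete return
        # type even when sum's own inference would widen to Any.
        return sum(f(x; k = k) for k in ks)::ComplexF64
    end
    return model
end

m = build_model_typed([0.1, 0.5, 0.2])
m(2.2)  # a ComplexF64; @code_warntype m(2.2) sees the assertion
```

The assertion costs a cheap runtime check but decouples the caller's inferred type from the inference state of `sum`.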


Does this problem affect performance, e.g. when doing integration?

I would think so, yes.

Annoying, but OK, I should always check for it, then. Thanks a lot.

With this, the original problem is nearly resolved.
My main confusion was due to a global variable holding the function, model = build_model(...).

The issue I couldn’t find should be https://github.com/JuliaLang/julia/issues/35800.


Is there a version of the sum comprehension that does not have this issue?
Something simpler than this:

    function mysum(::Type{T}, itr) where {T}
        v = zero(T)
        for x in itr
            v += x
        end
        return v
    end
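As a hedged aside (assuming Julia 1.6 or later, where `sum` gained an `init` keyword): pinning the accumulator's starting value avoids a hand-rolled loop while still fixing the accumulator type up front. The numbers below just mimic the thread's `(k*x)^i * cis(i*p*x)` terms:

```julia
# init fixes the accumulator type from the start, so the empty-sum /
# type-widening machinery inside mapreduce never comes into play.
s = sum(((0.1 * 2.2)^i * cis(i * 0.3 * 2.2) for i in 1:3);
        init = zero(ComplexF64))
```

Note the parentheses around the generator: they are needed so the `init` keyword is parsed as an argument to `sum` rather than part of the generator.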

If I were asked to pick my favourite Julia issue, I would perhaps pick this one.
I use sum everywhere in my code…