Need help with incomprehensible type inference

Hi there, I am the author of ExtensibleEffects.jl.

This is an advanced functional-programming package, so please bear with me if the following example looks totally alien and it is not obvious why you would ever want to do something like this. It is a minified version of the actual ExtensibleEffects code, and I am very sorry that I have not been able to simplify it any further so far. At least it is reproducible and fits into a Discourse post, hence I want to ask for help.

The Goal

using ExtensibleEffects
using TypeClasses
using Test

vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])
e1 = vector_of_eff_of_vector[1]
e2 = vector_of_eff_of_vector[2]

# some functional monadic helpers to work WITHIN the effects
mygoal(e1, e2) = @syntax_flatmap begin
    v1 = e1
    v2 = e2
    @pure [v1; v2]
end
julia> mygoal(e1, e2)
Eff(effectful=NoEffect{Vector{Int64}}([1, 20]), length(cont)=0)

julia> @inferred mygoal(e1, e2)
ERROR: return type ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}} does not match inferred return type ExtensibleEffects.Eff

julia> @code_warntype mygoal(e1, e2)
# on the terminal this gives nicely colored output, showing that only the last step fails to infer
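As a side note, inference can also be queried programmatically with `Base.return_types`, which asks the compiler what it infers without executing the call. This is a Base-only sketch with toy functions of my own, not the ExtensibleEffects code:

```julia
# Query inference directly, without running the code.
unstable(x) = x > 0 ? 1 : 1.0   # two branches, two return types
stable(x) = x + 1               # single concrete return type for Int input

Base.return_types(unstable, (Int,))  # [Union{Float64, Int64}]
Base.return_types(stable, (Int,))    # [Int64]
```

Unlike `Test.@inferred`, this reports the inferred type instead of throwing, which can be handy when bisecting where inference widens to an abstract type.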

Let’s look at the details of this type inference (which I do not understand).

The fancy syntax above boils down to something like the following:

function test_fails(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)
    
    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_fails(e1, e2)  # same as before
# ERROR: return type ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}} does not match inferred return type ExtensibleEffects.Eff
@code_warntype test_fails(e1, e2)  # same as before

It fails analogously.
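To illustrate what the desugared code actually computes, here is the same combine/curry/flatmap pattern written against plain Vectors instead of `Eff`. This is my own Base-only sketch for illustration; `flatmap` and `vector_analogue` below are hypothetical helpers, not part of TypeClasses:

```julia
# Base-only sketch: flatmap over Vectors is map followed by concatenation.
flatmap(f, xs) = reduce(vcat, map(f, xs))

function vector_analogue(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)

    e1_f = map(curried_combine, e1)       # vector of partially applied combines
    f_flatmap(f) = map(v2 -> f(v2), e2)   # apply each such function across e2
    flatmap(f_flatmap, e1_f)
end

vector_analogue([[1]], [[20]])  # [[1, 20]], mirroring Eff's NoEffect([1, 20])
```

The structure is identical to `test_fails`; only the monad differs (Vector instead of `Eff`).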

However, a slight variation does not fail:

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)
    
    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

f_flatmap, e1_f = prepare_test(e1, e2)
@inferred TypeClasses.flatmap(f_flatmap, e1_f)  # infers perfectly

It also works if these two steps are again put into a function:

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_infers(e1, e2)  # infers perfectly

This drives me crazy. I do not understand the reason behind it.

These issues degrade the performance of ExtensibleEffects.jl in many, many places.
Any help to understand what is going on is highly appreciated :heart:.

(using Julia 1.7.1)


I’m at a loss here myself. What’s interesting is that I managed to get the Julia session into a state where test_infers fails:

julia> @inferred test_infers(e1, e2)
ERROR: return type ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}} does not match inferred return type ExtensibleEffects.Eff
Stacktrace:
 [1] error(s::String)
   @ Base .\error.jl:33
 [2] top-level scope
   @ REPL[54]:1

julia> @code_warntype test_infers(e1, e2)
MethodInstance for test_infers(::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}, ::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}})
  from test_infers(e1, e2) in Main at REPL[48]:1
Arguments
  #self#::Core.Const(test_infers)
  e1::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}
  e2::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}
Locals
  @_4::Int64
  e1_f::ExtensibleEffects.Eff{NoEffect{var"#31#35"{Vector{Int64}, var"#combine#33"}}, Tuple{}}
  f_flatmap::var"#f_flatmap#36"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}
Body::ExtensibleEffects.Eff
1 ─ %1 = Main.prepare_test(e1, e2)::Tuple{var"#f_flatmap#36"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}, ExtensibleEffects.Eff{NoEffect{var"#31#35"{Vector{Int64}, var"#combine#33"}}, Tuple{}}}
│   %2 = Base.indexed_iterate(%1, 1)::Core.PartialStruct(Tuple{var"#f_flatmap#36"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}, Int64}, Any[var"#f_flatmap#36"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}, Core.Const(2)])
│        (f_flatmap = Core.getfield(%2, 1))
│        (@_4 = Core.getfield(%2, 2))
│   %5 = Base.indexed_iterate(%1, 2, @_4::Core.Const(2))::Core.PartialStruct(Tuple{ExtensibleEffects.Eff{NoEffect{var"#31#35"{Vector{Int64}, var"#combine#33"}}, Tuple{}}, Int64}, Any[ExtensibleEffects.Eff{NoEffect{var"#31#35"{Vector{Int64}, var"#combine#33"}}, Tuple{}}, Core.Const(3)])
│        (e1_f = Core.getfield(%5, 1))
│   %7 = TypeClasses.flatmap::Core.Const(TypeClasses.flatmap)
│   %8 = f_flatmap::var"#f_flatmap#36"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}
│   %9 = (%7)(%8, e1_f)::ExtensibleEffects.Eff
└──      return %9

Compare to a session where it succeeds:

julia> @code_warntype test_infers(e1, e2)
MethodInstance for test_infers(::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}, ::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}})
  from test_infers(e1, e2) in Main at REPL[53]:1
Arguments
  #self#::Core.Const(test_infers)
  e1::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}
  e2::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}
Locals
  @_4::Int64
  e1_f::ExtensibleEffects.Eff{NoEffect{var"#53#57"{Vector{Int64}, var"#combine#55"}}, Tuple{}}
  f_flatmap::var"#f_flatmap#58"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}
Body::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}
1 ─ %1 = Main.prepare_test(e1, e2)::Tuple{var"#f_flatmap#58"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}, ExtensibleEffects.Eff{NoEffect{var"#53#57"{Vector{Int64}, var"#combine#55"}}, Tuple{}}}
│   %2 = Base.indexed_iterate(%1, 1)::Core.PartialStruct(Tuple{var"#f_flatmap#58"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}, Int64}, Any[var"#f_flatmap#58"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}, Core.Const(2)])
│        (f_flatmap = Core.getfield(%2, 1))
│        (@_4 = Core.getfield(%2, 2))
│   %5 = Base.indexed_iterate(%1, 2, @_4::Core.Const(2))::Core.PartialStruct(Tuple{ExtensibleEffects.Eff{NoEffect{var"#53#57"{Vector{Int64}, var"#combine#55"}}, Tuple{}}, Int64}, Any[ExtensibleEffects.Eff{NoEffect{var"#53#57"{Vector{Int64}, var"#combine#55"}}, Tuple{}}, Core.Const(3)])
│        (e1_f = Core.getfield(%5, 1))
│   %7 = TypeClasses.flatmap::Core.Const(TypeClasses.flatmap)
│   %8 = f_flatmap::var"#f_flatmap#58"{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}
│   %9 = (%7)(%8, e1_f)::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}
└──      return %9

Please try to keep this thread uncluttered. I have already tried to summarize everything as needed, so that people do not need to post macroexpand or @code_warntype output themselves; that just makes it harder for others to follow.

I am rather looking for explanations, or for reproducible variations which can shed further light on the case.

@mkitti can you please put your large code blocks into <details> </details> blocks, so that this thread gets a bit easier to read?


I’m going to leave the second post as is because it appears that the inference engine is inconsistent.

Here's a full REPL session where inference seems to break, but maybe I'm missing something with globals and soft scope.
julia> using ExtensibleEffects

julia> using TypeClasses

julia> using Test

julia>

julia> vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])
2-element Vector{ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}}:
 Eff(effectful=NoEffect{Vector{Int64}}([1]), length(cont)=0)
 Eff(effectful=NoEffect{Vector{Int64}}([20]), length(cont)=0)

julia> e1 = vector_of_eff_of_vector[1]
Eff(effectful=NoEffect{Vector{Int64}}([1]), length(cont)=0)

julia> e2 = vector_of_eff_of_vector[2]
Eff(effectful=NoEffect{Vector{Int64}}([20]), length(cont)=0)

julia> function prepare_test(e1, e2)
           combine(v1, v2) = [v1; v2]
           curried_combine(v1) = v2 -> combine(v1, v2)

           e1_f = map(curried_combine, e1)
           f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
           f_flatmap, e1_f
       end
prepare_test (generic function with 1 method)

julia>

julia> f_flatmap, e1_f = prepare_test(e1, e2)
(f_flatmap, Eff(effectful=NoEffect{var"#3#7"{Vector{Int64}, var"#combine#5"}}(var"#3#7"{Vector{Int64}, var"#combine#5"}([1], var"#combine#5"())), length(cont)=0))

julia> @inferred TypeClasses.flatmap(f_flatmap, e1_f)  # infers perfectly
Eff(effectful=NoEffect{Vector{Int64}}([1, 20]), length(cont)=0)

julia> function test_infers(e1, e2)
           f_flatmap, e1_f = prepare_test(e1, e2)
           TypeClasses.flatmap(f_flatmap, e1_f)
       end
test_infers (generic function with 1 method)

julia>

julia> @inferred test_infers(e1, e2)  # infers perfectly
Eff(effectful=NoEffect{Vector{Int64}}([1, 20]), length(cont)=0)

julia>     combine(v1, v2) = [v1; v2]
ERROR: error in method definition: function TypeClasses.combine must be explicitly imported to be extended
Stacktrace:
 [1] top-level scope
   @ none:0
 [2] top-level scope
   @ REPL[14]:1

julia>     curried_combine(v1) = v2 -> combine(v1, v2)
curried_combine (generic function with 1 method)

julia>     e1_f = map(curried_combine, e1)
Eff(effectful=NoEffect{var"#10#11"{Vector{Int64}}}(var"#10#11"{Vector{Int64}}([1])), length(cont)=0)

julia>     f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
ERROR: cannot define function f_flatmap; it already has a value
Stacktrace:
 [1] top-level scope
   @ none:0
 [2] top-level scope
   @ REPL[17]:1

julia>     f_flatmap, e1_f
(f_flatmap, Eff(effectful=NoEffect{var"#10#11"{Vector{Int64}}}(var"#10#11"{Vector{Int64}}([1])), length(cont)=0))

julia> function prepare_test(e1, e2)
           combine(v1, v2) = [v1; v2]
           curried_combine(v1) = v2 -> combine(v1, v2)

           e1_f = map(curried_combine, e1)
           f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
           f_flatmap, e1_f
       end
prepare_test (generic function with 1 method)

julia>

julia> f_flatmap, e1_f = prepare_test(e1, e2)
(f_flatmap, Eff(effectful=NoEffect{var"#14#18"{Vector{Int64}, var"#combine#16"}}(var"#14#18"{Vector{Int64}, var"#combine#16"}([1], var"#combine#16"())), length(cont)=0))

julia> @inferred TypeClasses.flatmap(f_flatmap, e1_f)  # infers perfectly
Eff(effectful=NoEffect{Vector{Int64}}([1, 20]), length(cont)=0)

julia> function test_infers(e1, e2)
           f_flatmap, e1_f = prepare_test(e1, e2)
           TypeClasses.flatmap(f_flatmap, e1_f)
       end
test_infers (generic function with 1 method)

julia>

julia> @inferred test_infers(e1, e2)  # infers perfectly
Eff(effectful=NoEffect{Vector{Int64}}([1, 20]), length(cont)=0)

julia> function prepare_test(e1, e2)
           combine(v1, v2) = [v1; v2]
           curried_combine(v1) = v2 -> combine(v1, v2)

           e1_f = map(curried_combine, e1)
           f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
           f_flatmap, e1_f
       end
prepare_test (generic function with 1 method)

julia> function test_infers(e1, e2)
           f_flatmap, e1_f = prepare_test(e1, e2)
           TypeClasses.flatmap(f_flatmap, e1_f)
       end
test_infers (generic function with 1 method)

julia>

julia> @inferred test_infers(e1, e2)  # infers perfectly
ERROR: return type ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}} does not match inferred return type ExtensibleEffects.Eff
Stacktrace:
 [1] error(s::String)
   @ Base .\error.jl:33
 [2] top-level scope
   @ REPL[26]:1

Copy and pasting this into my Julia 1.7.2 REPL results in `@inferred test_infers(e1, e2)` failing.
using Pkg
Pkg.activate(; temp = true)
Pkg.add(["ExtensibleEffects", "TypeClasses", "Test"])

using ExtensibleEffects
using TypeClasses
using Test

vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])

e1 = vector_of_eff_of_vector[1]
e2 = vector_of_eff_of_vector[2]

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)

    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

f_flatmap, e1_f = prepare_test(e1, e2)

@inferred TypeClasses.flatmap(f_flatmap, e1_f)  # infers perfectly

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_infers(e1, e2)  # infers perfectly

curried_combine(v1) = v2 -> combine(v1, v2)

e1_f = map(curried_combine, e1)

f_flatmap, e1_f

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)

    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

f_flatmap, e1_f = prepare_test(e1, e2)

@inferred TypeClasses.flatmap(f_flatmap, e1_f)  # infers perfectly

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_infers(e1, e2)  # infers perfectly

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)

    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end
@inferred test_infers(e1, e2)  # now it breaks?

The session is reproducible on multiple computers. There appears to be a bug in the inference engine.

Minimized script resulting in broken inference
using Pkg
Pkg.activate(; temp = true)
Pkg.add(["ExtensibleEffects", "TypeClasses", "Test"])

using ExtensibleEffects
using TypeClasses
using Test

vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])

e1 = vector_of_eff_of_vector[1]
e2 = vector_of_eff_of_vector[2]

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)

    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_infers(e1, e2)  # now it breaks?
Minimal script where inference succeeds. Running `TypeClasses.flatmap(...)` has a side effect.
using Pkg
Pkg.activate(; temp = true)
Pkg.add(["ExtensibleEffects", "TypeClasses", "Test"])

using ExtensibleEffects
using TypeClasses
using Test

vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])

e1 = vector_of_eff_of_vector[1]
e2 = vector_of_eff_of_vector[2]

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)

    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

TypeClasses.flatmap(prepare_test(e1, e2)...)  # infers perfectly

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_infers(e1, e2)  # infers perfectly
Minimal script where inference succeeds at first, and then fails. Redefining `prepare_test` invalidates the side effect.
using Pkg
Pkg.activate(; temp = true)
Pkg.add(["ExtensibleEffects", "TypeClasses", "Test"])

using ExtensibleEffects
using TypeClasses
using Test

vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])

e1 = vector_of_eff_of_vector[1]
e2 = vector_of_eff_of_vector[2]

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)

    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

TypeClasses.flatmap(prepare_test(e1, e2)...)  # infers perfectly

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_infers(e1, e2)  # infers perfectly

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)

    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

@inferred test_infers(e1, e2)  # now it breaks?

Reminiscent of this thread, where I’ve just now also confirmed the effect that redefining a callee function causes the caller function to be inferred as type-unstable again. Test.@inferred is actually trickier than @code_warntype: in that thread’s example, the order of running and compiling the callee and caller methods affects whether @inferred gets “stuck” reporting an instability, and the trick of redefining a method to “undo” its compilation can be used to fix the compilation order and make @inferred work.
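The invalidation mechanism itself is easy to demonstrate in isolation. This is a Base-only sketch; `f` and `g` below are toy functions of my own, unrelated to the package code:

```julia
# Minimal sketch of method invalidation: redefining the callee f
# discards the compiled code of the caller g, which then sees the new f.
f() = 1
g() = f()
@assert g() == 1
f() = 2          # redefinition invalidates g's compiled code
@assert g() == 2 # g is recompiled against the new f
```

This only shows that redefinition triggers recompilation; the surprising part in this thread is that recompilation can land on a *different* inference result than the first compilation did.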


Thank you @mkitti! That executing TypeClasses.flatmap(prepare_test(e1, e2)...) somehow introduces a type-inference side effect is very impressive and mind-blowing.

I wish a Julia core developer could comment on this type-inference behaviour.


@tim.holy @kristoffer.carlsson can you take a look?

Maybe we should try SnoopCompile.jl (GitHub - timholy/SnoopCompile.jl: Making packages work faster with more extensive precompilation).


I have now posted this issue on the Julia GitHub repository, summarizing it as: type inference depends on the execution order.

A big thank you to @mkitti for your help in reaching this understanding.
