Inference extremely slow for tiny program in Julia 0.6.4

The following code runs very slowly in Julia 0.6.4 (~3 seconds), but quickly in Julia 0.7.0. Any idea what’s going on?

function fun2()
    A .*= A ./ y
    B .*= B ./ y
    C .*= C ./ y
end

function fun1()
    x = 1
    x == 1 && return
    fun2()   # never reached at runtime, but still inferred
end

for n = 1:1
    @time fun1()
end

This is how I test it:

julia -e '@time include("inference.jl"); @time include("inference.jl");'

In Julia 0.6.4:

  0.000000 seconds
  3.552294 seconds (6.27 M allocations: 299.748 MiB, 6.40% gc time)
  0.000000 seconds
  3.083271 seconds (5.92 M allocations: 280.767 MiB, 1.16% gc time)

In Julia 0.7.0:

  0.000000 seconds
  0.027557 seconds (27.76 k allocations: 1.366 MiB)
  0.000000 seconds
  0.017841 seconds (27.03 k allocations: 1.292 MiB)

Profiling shows all time spent in inference:

2070 ./event.jl:73; (::Base.REPL.##1#2{Base.REPL.REPLBackend})()
 2070 ./REPL.jl:97; macro expansion
  2070 ./REPL.jl:66; eval_user_input(::Any, ::Base.REPL.REPLBa...
   2070 ./boot.jl:235; eval(::Module, ::Any)
    2070 ./<missing>:?; anonymous
     2070 ./profile.jl:23; macro expansion
      2070 ./sysimg.jl:14; include(::String)
       2070 ./loading.jl:576; include_from_node1(::String)
        2057 ./inference.jl:2628; typeinf_ext(::Core.MethodInstance, :...
         2056 ./inference.jl:2787; typeinf(::Core.Inference.InferenceS...
          2056 ./inference.jl:2669; typeinf_work(::Core.Inference.Infer...
           2056 ./inference.jl:2076; abstract_interpret(::Any, ::Array{...
            2056 ./inference.jl:1950; abstract_eval(::Any, ::Array{Any,...
             2056 ./inference.jl:1927; abstract_eval_call(::Expr, ::Arra...
              2056 ./inference.jl:1897; abstract_call(::Any, ::Array{Any...
               2056 ./inference.jl:1420; abstract_call_gf_by_type(::Any, ...
                2056 ./inference.jl:2535; typeinf_edge(::Method, ::Any, :...

All of your variables are globals. Don't use globals if you want performance; see the "Performance Tips" section of the manual.
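For reference, a minimal sketch of the usual fixes (the names here are made up, not the poster's code): mark globals `const` so their type is known, or, better, pass values as function arguments so types are inferred from the call site.

```julia
# Untyped global: the compiler can't assume its type inside functions.
y = 2.0

# Fix 1: a const global has a fixed, known type.
const y_const = 2.0

# Fix 2 (preferred): pass values as arguments; the method is
# specialized on the concrete types at the call site.
scale(A, y) = A ./ y

scale([2.0, 4.0], 2.0)
```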

You’re right; I created a minimal example to illustrate a problem I’m seeing in actual code, where I’m not using any globals. The following example has no globals and is closer to my actual code, while still exhibiting the same problem.

function fun2(; A::AbstractVector{Float64} = [1.0])
    B = copy(A)
    B .*= (A - 1.0) ./ (1.0 - A)
end

function fun1()
    fun2(A = [1.0])
end


It takes about 1 second in Julia 0.6.4 and 46 ms in Julia 0.7.0. Using @code_warntype, it seems the problem stems from the abstractly typed keyword argument. I don’t recall hearing about this before; is this an issue that has been fixed in Julia 0.7.0?

@code_warntype fun2(A = [1.0])

Julia 0.6.4:

(output omitted)

Julia 0.7.0:

 1 ─ %1  = (Base.getfield)(#temp#, :A)::Array{Float64,1}

The broadcasting implementation got completely rewritten for 0.7 so the new implementation might just be nicer for inference.

Hmm, ok, that could explain part of it, but there’s also the keyword argument type instability, right? This is my understanding of what’s going on in my two examples, please correct me if something doesn’t sound right:

Example 1: A, B, C and y are not defined, so when fun2 is JIT compiled, their types are not known, and inference takes a long time in Julia 0.6.4, due to broadcasting + inference not working well together. In Julia 0.7, broadcasting + inference is a lot faster.

Example 2: The type of B is not known by the compiler in Julia 0.6.4, so again inference takes a long time. In Julia 0.7, the compiler is able to figure out the type of B, so this is fast (and would be fast regardless since broadcasting + inference works better).
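One way to check this kind of reasoning directly (a sketch; the function name here is made up) is `Test.@inferred`, which evaluates a call and throws if its return type cannot be inferred by the compiler:

```julia
using Test  # Base.Test on Julia 0.6

# Concretely typed argument: the return type is inferable.
halve(A::Vector{Float64}) = A ./ 2.0

# Passes silently; would throw for a type-unstable call.
@inferred halve([2.0, 4.0])
```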


My takeaways:

  • Upgrade to 0.7 as soon as possible
  • While still on 0.6.4, avoid keyword arguments with abstract type annotations, since they lead to type instability
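On 0.6, one workaround (a hypothetical sketch of example 2, not the actual code) is to leave the keyword argument untyped and immediately forward it to a positional method, since positional methods do get specialized on the concrete argument type:

```julia
# Positional method: specialized on the concrete type of A.
function _fun2(A::AbstractVector{Float64})
    B = copy(A)
    B .*= (A .- 1.0) ./ (1.0 .- A)
    return B
end

# Thin keyword wrapper: the kwarg is just forwarded.
fun2(; A = [1.0]) = _fun2(A)
```

This "keyword barrier" pattern is unnecessary on 0.7 and later, where keyword arguments are specialized like positional ones.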

I’m not sure the type instability has much to do with the inference speed.

Definitely agree with this though 🙂

@ChrisRackauckas, do you think it would be a good idea to remove gotcha #7, given that it isn’t needed for versions 1.0 and above? If you don’t want to remove it, could you consider adding a comment in the main text itself?

I became aware of not needing the steps in #7 through the comments on your blog post, but it’s probably useful to add a small note in the main text too.

What are your thoughts?

Probably. The post is about 3 years old now, so there are a few other things that would need an update.


Yeah, I guess that would be nice too. I’m particularly fond of this blog post and regularly suggest it to newcomers on the forum. I personally learnt a lot from it when I started ~10 months ago. An update on some other things to make the post more relevant for newer versions would be an amazing public good!