Calling a simple method with unused type parameters unexpectedly allocates memory

Hello,

I am trying to understand why calling a relatively simple method with unused type parameters allocates memory, while the same method without type parameters does not.

I have a method that aims to compute a Gramian matrix from a vector basis. The method has three arguments: two instances of singleton types and one matrix. Here is a simplified version of this method:

using BenchmarkTools

struct TVar1 end
struct TVar2 end
struct TVec{Var, Sign} end

function foo(arg1::TVec{Var1, Sign1}, arg2::TVec{Var2, Sign2}, gram_matrix) where {Var1, Sign1, Var2, Sign2}
    gram_matrix[1, 2] = 20.
end

function dummy_assignment(M)
    gram_matrix = Array{Float64, 2}(undef, 2, 2)
    fill!(gram_matrix, 0.1)
    basis = [TVec{TVar1(), 1}(); TVec{TVar2(), -1}()]
    for k = 1:M
        for i = 1:length(basis)
            for j = 1:length(basis)
                foo(basis[i], basis[j], gram_matrix)
            end
        end
    end
end

Launching a time benchmark:

@btime dummy_assignment(10000)

gives me: 937.600 μs (40002 allocations: 625.20 KiB)

If I delete the unused type parameters from the method:

function foo(arg1::TVec, arg2::TVec, gram_matrix)
    gram_matrix[1, 2] = 20.
end

the same benchmark gives: 75.200 μs (2 allocations: 208 bytes)

I’ve also checked the method with @code_warntype for possible type instabilities but everything seems to be ok.

Could you please help me understand why Julia is allocating memory in this particular case?
Thank you for your help.

Kind regards,
Alexandru

My guess is that since basis[i] and basis[j] are only determined dynamically in the loop, a signature that specific forces the dispatch to happen at runtime. In the second case there is just one method to dispatch to.
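
One way to see it, reusing the definitions from your example (just a sketch): the element type of basis is abstract, so the concrete types of basis[i] and basis[j] are only known once the elements are read at runtime:

basis = [TVec{TVar1(), 1}(); TVec{TVar2(), -1}()]

eltype(basis)                    # abstract TVec: the two elements have different concrete types
isconcretetype(eltype(basis))    # false

# Each foo(basis[i], basis[j], gram_matrix) call therefore has to look up the
# matching method at runtime, which is what shows up as the extra allocations.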

Thank you for your insight. I must say that I tend to always specify the types of method arguments and composite fields. I am most certainly biased, since my day-to-day work is done in C++. But I see now that over-constraining the compiler can have negative effects.

Do you have any references for best-practice guidelines on how to use and specify types in Julia, and on how to avoid runtime dispatch? That would be really helpful for me. Thank you.

Maybe related, this section on specialization: Performance Tips · The Julia Language

This discussion and this issue might be related, but they are about completely unused type parameters, whereas your type parameters are all attached to the args.

However I’d say it’s good practice to boil these annotations down to the absolute minimum you need for dispatch.
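
If it helps to see that the loose signature loses nothing inside the method: Julia still specializes the body on the concrete argument types it is called with, so the type parameters remain fully known to the compiler. A quick check, assuming the definitions from the first post (just a sketch):

using InteractiveUtils  # for @code_typed outside the REPL

a = TVec{TVar1(), 1}()
b = TVec{TVar2(), -1}()
G = fill(0.1, 2, 2)

# Even with foo(arg1::TVec, arg2::TVec, gram_matrix), this shows code compiled
# specifically for TVec{TVar1(), 1} and TVec{TVar2(), -1}; the annotations only
# affect which method is chosen, not how the chosen method is optimized.
@code_typed foo(a, b, G)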

Please don’t take it too seriously. I only reached a fairly complicated example there. To really understand what is going on, one would need to inspect the lowered code (which I am not very good at).

In any case, if that part of the code is critical for performance, you should avoid containers (vectors, arrays) of mixed element types. That will almost always degrade performance because of the runtime dispatch it may imply, unless the situation is simple enough for the compiler to figure it out and split the union.

For example, in that specific case, if you help the compiler by initializing the vector as a union of types, you get rid of the allocations:

julia> function dummy_assignment(M)
           gram_matrix = Array{Float64, 2}(undef, 2, 2)
           fill!(gram_matrix, 0.1)
           basis = Union{TVec{TVar1(),1},TVec{TVar2(),-1}}[TVec{TVar1(), 1}(); TVec{TVar2(), -1}()]
           for k = 1:M
               for i = 1:length(basis)
                   for j = 1:length(basis)
                       foo(basis[i], basis[j], gram_matrix)
                   end
               end
           end
       end
dummy_assignment (generic function with 1 method)

julia> @btime dummy_assignment(10000)
  87.805 μs (2 allocations: 208 bytes)


But that will only work if the number of different types is small. Otherwise the compiler will give up.
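
If you ever do end up with more distinct types than that, one alternative (a sketch using the same definitions; the function name is made up) is to keep the basis in a tuple instead of a vector, so each element keeps its concrete type and the nested loop can be unrolled:

function dummy_assignment_tuple(M)
    gram_matrix = fill(0.1, 2, 2)
    # A Tuple preserves the concrete type of each element:
    basis = (TVec{TVar1(), 1}(), TVec{TVar2(), -1}())
    for k = 1:M
        # map over a Tuple is unrolled, so every foo call sees concrete argument types
        map(bi -> map(bj -> foo(bi, bj, gram_matrix), basis), basis)
    end
    return gram_matrix
end

This should avoid the per-call dynamic dispatch without relying on union splitting, although it has its own limit: very long heterogeneous tuples get expensive to compile.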