When does Julia infer optimal code, and when not?

Dear Julia pros,

Let's say I have a basic function fn(i::Integer) = 5 and another function gn(i::Integer) = ...

What is allowed in gn so that

@code_llvm fn(3)
@code_llvm gn(3)

return very similar results, i.e. gn is reduced to very simple code?

thanks a lot!

some examples

works

function g(a::Integer)
  a = 0
  for i ∈ 1:4
    a += i
  end
  a
end

@code_llvm g(1)

returns

; Function g
; Location: D:\ProjectsJulia\Traits\test\tmp.jl:68
; Function Attrs: uwtable
define i64 @julia_g_36250(i64) #0 {
top:
; Location: D:\ProjectsJulia\Traits\test\tmp.jl:72
  ret i64 10
}

which optimizes the whole loop away

does not work

other times it does not work, for instance

function g2(a::Integer)
  methods(reduce)
  4
end

@code_llvm g2(1)

gives

; Function g2
; Location: D:\ProjectsJulia\Traits\test\tmp.jl:72
; Function Attrs: uwtable
define i64 @julia_g2_36717(i64) #0 {
top:
  %1 = alloca %jl_value_t addrspace(10)*, i32 2
; Function methods; {
; Location: reflection.jl:769
  %2 = getelementptr %jl_value_t addrspace(10)*, %jl_value_t addrspace(10)** %1, i32 0
  store %jl_value_t addrspace(10)* addrspacecast (%jl_value_t* inttoptr (i64 21682165440 to %jl_value_t*) to %jl_value_t addrspace(10)*), %jl_value_t addrspace(10)** %2
  %3 = getelementptr %jl_value_t addrspace(10)*, %jl_value_t addrspace(10)** %1, i32 1
  store %jl_value_t addrspace(10)* addrspacecast (%jl_value_t* inttoptr (i64 77858240 to %jl_value_t*) to %jl_value_t addrspace(10)*), %jl_value_t addrspace(10)** %3
  %4 = call nonnull %jl_value_t addrspace(10)* @japi1_methods_35966(%jl_value_t addrspace(10)* addrspacecast (%jl_value_t* inttoptr (i64 21689050048 to %jl_value_t*) to %jl_value_t addrspace(10)*), %jl_value_t addrspace(10)** %1, i32 2)
;}
; Location: D:\ProjectsJulia\Traits\test\tmp.jl:73
  ret i64 4
}

You might find some answers to your questions in this video from JuliaCon 2018: Information Overload: tools for making program analysis and debugging manageable


The question is not well-posed, as there are many gn variations which the compiler would reduce: basically anything that is independent of i and for which constant propagation works.
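
As a minimal sketch (a hypothetical gn, not taken from the post above): a body built only from literals does not depend on i at all, so it folds to a constant

gn(i::Integer) = 2^10 + 7   # pure arithmetic on literals, independent of i

@code_llvm gn(3)            # should reduce to essentially `ret i64 1031`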

That said, while constant propagation is a very useful optimization, IMO it is not the most important thing you should care about if you want to write performant Julia code for anything of medium complexity. When the compiler fails to figure out the type of a value, you usually pay a much higher price. Looking at @code_warntype is much more useful for debugging this.
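
A hypothetical sketch of the kind of problem @code_warntype catches: when the type of a variable depends on a runtime branch, it shows up as a Union (or Any) in the output

function unstable(flag::Bool)
  x = flag ? 1 : 1.0      # x is Union{Int64, Float64}, not a concrete type
  x + 1
end

@code_warntype unstable(true)   # the Union shows up in the variable and return types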

You may also find the following libraries useful:


thanks a lot for the references!

I hope I will find time to look through all of them in detail.

For now I seem to be able to guarantee optimal code by using @generated functions for all the inference steps that don't work by themselves.
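
Roughly like this (a sketch of the idea; ndims_plus_one is just a made-up example): the generator body runs on the argument types, so whatever it computes is spliced into the method as a compile-time constant

@generated function ndims_plus_one(x::AbstractArray)
  n = ndims(x)      # inside the generator, x is the type, e.g. Matrix{Float64}
  return :($n + 1)  # splice the precomputed constant into the generated body
end

@code_llvm ndims_plus_one(rand(2, 3))   # should boil down to `ret i64 3`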