Question on Type Inference with Anonymous Functions and Broadcast

Hello!

I have a question. Consider this function:

f(x) = exp(2 * x)
x = range(0, 1, 2^5)

If I want to pass x into it I broadcast with f.(x), and if I run @code_warntype on that call I can see everything is inferred.

Body::Vector{Float64}
1 ─ %1 = Base.broadcasted(Main.f, x1)::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Nothing, typeof(f), Tuple{StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64}}}
│   %2 = Base.materialize(%1)::Vector{Float64}
└──      return %2

If I instead create an anonymous function from f and broadcast it, however, I get type Any.

h = x -> f(x)
@code_warntype h.(x)

The result is

Body::Any
1 ─ %1 = Base.broadcasted(Main.h, x1)::Any
│   %2 = Base.materialize(%1)::Any
└──      return %2

I would appreciate any hints on what is happening with broadcasting here that causes the Any.

julia> x = range(0, 1, 2^5)
0.0:0.03225806451612903:1.0

julia> f(x) = exp(2 * x)
f (generic function with 1 method)

julia> @code_warntype (x -> f(x)).(x)
MethodInstance for (::var"##dotfunction#291#2")(::StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64})
  from (::var"##dotfunction#291#2")(x1) in Main
Arguments
  #self#::Core.Const(var"##dotfunction#291#2"())
  x1::StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64}
Locals
  #1::var"#1#3"
Body::Vector{Float64}
1 ─      (#1 = %new(Main.:(var"#1#3")))
│   %2 = #1::Core.Const(var"#1#3"())
│   %3 = Base.broadcasted(%2, x1)::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Nothing, var"#1#3", Tuple{StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64}}}
│   %4 = Base.materialize(%3)::Vector{Float64}
└──      return %4


julia> const h = x -> f(x)
#4 (generic function with 1 method)

julia> @code_warntype h.(x)
MethodInstance for (::var"##dotfunction#292#6")(::StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64})
  from (::var"##dotfunction#292#6")(x1) in Main
Arguments
  #self#::Core.Const(var"##dotfunction#292#6"())
  x1::StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64}
Body::Vector{Float64}
1 ─ %1 = Base.broadcasted(Main.h, x1)::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Nothing, var"#4#5", Tuple{StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64}}}
│   %2 = Base.materialize(%1)::Vector{Float64}
└──      return %2

As you can see, the problem is that in your code the binding h = x -> f(x) is not a constant (the type of the value bound to h might change), so the compiler cannot infer the call.

In particular, it is a non-constant global variable, which is always slow in Julia; avoiding untyped globals is the very first performance tip in the manual.

If you use anonymous functions locally inside a function or other local scope, they are just fine.
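To make the two fixes concrete, here is a small sketch (reusing the f and x from the thread; the names h_const and apply_locally are my own for illustration). Either declare the global binding const so its type is fixed, or define the anonymous function inside a function body, where it is a local binding and fully inferred:

```julia
f(x) = exp(2 * x)
x = range(0, 1, 2^5)

# Option 1: a const global binding — the type of h_const can never change,
# so broadcasting over it is fully inferred.
const h_const = x -> f(x)

# Option 2: define the anonymous function in a local scope. Inside a
# function body the binding g is local and its concrete type is known
# to the compiler.
function apply_locally(xs)
    g = x -> f(x)   # local binding, fully inferred
    return g.(xs)
end

y1 = h_const.(x)
y2 = apply_locally(x)
```

Both calls produce the same Vector{Float64}; @code_warntype on either should show no Any.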


Yes, good points. I totally missed that that was the issue here.