The Performance Tips discuss the drawbacks of capturing global variables inside of functions. Are there any performance penalties for capturing functions inside of other functions? For example,
function predict(x)
    # Returns some value
end

function loss(x, label)
    return sum(abs2, predict(x) .- label)
end
Will capturing predict() in loss() result in any performance penalties?
(I’m aware that functions are const by default, but wanted to see if there were other considerations to be aware of.)
No, there isn’t. If there were, then sum, abs2, and - would have the same problem, as would every other function in Base and in every other library.
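One quick way to convince yourself, assuming some concrete body for predict (a sketch, nothing special about these definitions):

using Test

predict(x) = x^2
loss(x, label) = sum(abs2, predict(x) .- label)

@inferred loss(2.0, 1.0)   # passes: the return type is fully inferred through the call to predict

Because predict here is an ordinary named function, its global binding is automatically const, so the compiler knows exactly which function loss calls and can infer straight through it.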
That is a bit unfortunate. I can appreciate why closures lead to boxing and hinder proper type inference. While it makes sense to me that predict = x -> x^2 formally creates a closure, the return type is perfectly inferrable from x alone, isn’t it? Then why does inference on loss struggle here?
Well, yes, but what predict means is not predictable, since anyone can do global predict = x -> string(x) and all of a sudden the return type is different.
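For example, in a fresh session (a sketch; the bodies are only illustrative):

predict = x -> x^2                  # non-const global binding
loss(x, label) = sum(abs2, predict(x) .- label)

loss(2.0, 1.0)                      # 9.0
predict = x -> string(x)            # rebind the same global
loss(2.0, 1.0)                      # now throws a MethodError: predict means something else

Because the binding can be replaced at any time, the compiler cannot assume anything about what predict(x) returns when it compiles loss, so it has to emit fully dynamic code.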
So it seems like the main issue is that the variable holding the function needs to be const, correct? The documentation mentions that function and struct declarations automatically make the assignment const, but when doing predict = x -> x^2 the variable predict is an ordinary non-const global.
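To make that concrete (a sketch; the names are just illustrative), @code_warntype shows the difference:

predict_dyn = x -> x^2             # non-const global binding
const predict_fixed = x -> x^2     # const global binding

loss_dyn(x, label)   = sum(abs2, predict_dyn(x) .- label)
loss_fixed(x, label) = sum(abs2, predict_fixed(x) .- label)

@code_warntype loss_dyn(2.0, 1.0)    # return type inferred as Any
@code_warntype loss_fixed(2.0, 1.0)  # return type inferred as Float64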
Correct me if I’m wrong, but I think const is still needed on the global binding even when predict itself is defined inside a function? For example,
using BenchmarkTools

function makepredict()
    predict = x -> x^2
    return predict
end

function g1(x)
    return x^2
end

g2 = makepredict()        # non-const global binding
const g3 = makepredict()  # const global binding

@btime for i=1:1000 g1(10) end # 0.877 ns (0 allocations: 0 bytes)
@btime for i=1:1000 g2(10) end # 6.766 μs (0 allocations: 0 bytes)
@btime for i=1:1000 g3(10) end # 0.877 ns (0 allocations: 0 bytes)
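For completeness, the usual workaround (a sketch, reusing makepredict, g2, and g3 from above; the helper name run1000 is just illustrative) is to pass the function in as an argument, or to interpolate the global into @btime with $, so that the method specializes on the closure’s concrete type even though the global binding itself is not const:

using BenchmarkTools

function run1000(f, x)
    s = zero(x)
    for i = 1:1000
        s += f(x)
    end
    return s
end

@btime run1000($g2, 10)   # should be as fast as the const g3 case
@btime run1000($g3, 10)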