# How to deal with Vector{Function} for type inference?

I have the following two functions:

```julia
function foo1!(v::Vector{Float64}, b::Vector{Float64})
    for i in eachindex(v)
        v[i] = 2 * v[i] + 3 * b[i]
    end
end

function foo2!(v::Vector{Float64}, b::Vector{Float64})
    for i in eachindex(v)
        v[i] = 10 * v[i]^2 - 8 * b[i]
    end
end
```

I want to apply one of them at random. Here are two slightly different ways:

```julia
function barA()
    v = rand(100)
    b = rand(100)

    if rand() < 0.5
        foo1!(v, b)
    else
        foo2!(v, b)
    end
end

function barB()
    foos = Function[foo1!, foo2!]

    v = rand(100)
    b = rand(100)

    i = rand(1:2)
    foo!::Function = foos[i]
    foo!(v, b)
end
```

`barA()` has no problem, but `barB()` shows type instability:

```julia
@code_warntype barB()
```

Is there any way to address the type stability of `barB()`?

Check out FunctionWrappers.jl.
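To sketch what that looks like (the `F` alias and `barB_wrapped` name are just for illustration, and this assumes the `foo1!`/`foo2!` definitions from the question): storing the functions in a vector whose element type is a concrete `FunctionWrapper` lets the compiler infer the call.

```julia
using FunctionWrappers: FunctionWrapper

# Concrete wrapper type: takes two Vector{Float64}, returns Nothing.
const F = FunctionWrapper{Nothing, Tuple{Vector{Float64}, Vector{Float64}}}

function barB_wrapped()
    foos = F[foo1!, foo2!]  # eltype is concrete, so foos[i] infers as F

    v = rand(100)
    b = rand(100)

    foo! = foos[rand(1:2)]
    foo!(v, b)
    return v
end
```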


Note that in this case both `barA()` and `barB()` return `nothing`… if you add `return v` before the end of each function, both implementations are type stable in their return value.
You still have type instability in the function vector, but I would first check whether it is really a problem in your implementation before worrying too much about it…
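Concretely, that suggestion amounts to something like this sketch (the `foos[i]` lookup inside is still `Function`-typed, but the function's return type is now the concrete `Vector{Float64}`):

```julia
function barB()
    foos = Function[foo1!, foo2!]

    v = rand(100)
    b = rand(100)

    foo! = foos[rand(1:2)]
    foo!(v, b)
    return v  # concrete return type: Vector{Float64}
end
```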

Thanks!

Just in case anyone wondered, here is what I did:

```julia
using FunctionWrappers
import FunctionWrappers: FunctionWrapper

function foo1!(v::Vector{Float64}, b::Vector{Float64})
    for i in eachindex(v)
        v[i] = 2 * v[i] + 3 * b[i]
    end
end

function foo2!(v::Vector{Float64}, b::Vector{Float64})
    for i in eachindex(v)
        v[i] = 10 * v[i]^2 - 8 * b[i]
    end
end

struct TypeStableStruct
    func::FunctionWrapper{Nothing, Tuple{Vector{Float64}, Vector{Float64}}}
end

const foo1_ = TypeStableStruct(foo1!)
const foo2_ = TypeStableStruct(foo2!)

evaluate_func(f::TypeStableStruct, a, b) = f.func(a, b)

function barC()
    v = zeros(4)
    b = ones(4)

    foos_ = TypeStableStruct[foo1_, foo2_]

    i = rand(1:2)
    foo_! = foos_[i]
    evaluate_func(foo_!, v, b)
end

@code_warntype barC()
```

Thanks for the comment. I am trying to speed up a large codebase and wanted to make sure that everything is type stable, so that I can focus on other parts of the optimization with peace of mind.
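For that kind of whole-codebase check, one option is the standard library's `Test.@inferred`, which throws if the compiler cannot infer a concrete return type for a call. A minimal sketch, assuming the `barC` definition from above:

```julia
using Test

# @inferred errors unless the inferred return type matches the actual one,
# so wrapping calls like this turns type-stability checks into test failures.
@test (@inferred barC()) === nothing
```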