Are there ways to preallocate memory via structs for JuMP autodiff?

I’m very aware of the limitation of ForwardDiff.jl that code has to be written generically enough to accept `{T<:Real}`. The JuMP docs suggest writing target functions like

function good_f(x::T...) where {T<:Real}
    y = zeros(T, length(x))  # Construct an array of type `T` instead!
    for i = 1:length(x)
        y[i] = x[i]^i
    end
    return sum(y)
end

I want to optimize a huge system where the target function will depend on varying combinations of structs, and both the types and the needed allocations are known before optimization. Therefore, preallocation would be highly desirable, e.g. I want to achieve something like:

struct PreAlloc
    y::Vector{<:Real}
end

Y = PreAlloc(zeros(T, length(x)))

function good_f(Y::PreAlloc, x::T...) where {T<:Real}
    for i = 1:length(x)
        Y.y[i] = x[i]^i
    end
    return sum(Y.y)
end

Now this will fail to write into Y.y because the element type of the preallocated vector won’t match the dual-number type during the autodiff pass. I know that SciML solves this problem via PreallocationTools.jl.

Is there a way to achieve something similar within the JuMP framework?


This is sidestepping your question a little bit, but why not just put your expression into JuMP directly? Like this:

@NLobjective(m, Min, sum(x[i]^i for i in 1:length(x)))

In this case, JuMP will handle the memory for you, and it is quite efficient about it.
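For context, a minimal runnable sketch of that approach (Ipopt is just an example solver here; any nonlinear solver works):

```julia
using JuMP, Ipopt

n = 3
m = Model(Ipopt.Optimizer)
@variable(m, 0.5 <= x[1:n] <= 2.0)          # bounded to keep the problem well-posed
@NLobjective(m, Min, sum(x[i]^i for i in 1:n))
optimize!(m)
```

No user-side allocation is needed: JuMP builds and differentiates the expression graph itself.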

I haven’t tried it, but presumably you can use PreallocationTools.jl directly. There’s nothing special about JuMP’s use of ForwardDiff.
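Untested with JuMP specifically, but with plain ForwardDiff a `DiffCache` sketch would look like this: `get_tmp` hands back either the plain buffer or a dual-typed buffer, depending on the element type of the argument you pass it.

```julia
using PreallocationTools, ForwardDiff

# DiffCache stores a regular buffer plus a dual-number buffer of the same shape.
const cache = DiffCache(zeros(2))

function good_f(x::T...) where {T<:Real}
    y = get_tmp(cache, x[1])   # buffer whose eltype matches x (Float64 or Dual)
    for i in 1:length(x)
        y[i] = x[i]^i
    end
    return sum(y)
end

good_f(1.0, 2.0)                                      # ordinary call
ForwardDiff.gradient(v -> good_f(v...), [1.0, 2.0])   # dual call reuses the cache
```

With the legacy `@NLobjective` interface you'd still need to register `good_f` as a user-defined function for JuMP to call it.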

You could always just do something like this:

julia> cache = Dict{Any,Any}()
Dict{Any, Any}()

julia> function good_f(x::T...) where {T}
           if !haskey(cache, T)
               cache[T] = zeros(T, length(x))
           end
           y = cache[T]::Vector{T}
           for i = 1:length(x)
               y[i] = x[i]^i
           end
           return sum(y)
       end
good_f (generic function with 1 method)

julia> good_f(1, 2)
5

julia> good_f(1.0, 2.0)
5.0

julia> cache
Dict{Any, Any} with 2 entries:
  Int64   => [1, 4]
  Float64 => [1.0, 4.0]

julia> good_f(1.0, 3.0)
10.0

julia> cache
Dict{Any, Any} with 2 entries:
  Int64   => [1, 4]
  Float64 => [1.0, 9.0]
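If you want the struct-based interface from your question, the same per-type trick works with a concretely parameterized struct instead of an abstractly typed field. This is just a hypothetical variant combining the two ideas, not an established pattern:

```julia
# Parameterize the struct so the vector's element type is concrete.
struct PreAlloc{T<:Real}
    y::Vector{T}
end

# One cached PreAlloc per element type encountered (Float64, Dual, ...).
const caches = Dict{DataType,Any}()

function good_f(x::T...) where {T<:Real}
    pa = get!(() -> PreAlloc{T}(zeros(T, length(x))), caches, T)::PreAlloc{T}
    for i in 1:length(x)
        pa.y[i] = x[i]^i
    end
    return sum(pa.y)
end
```

The type assertion on the `get!` result keeps the function type-stable despite the `Dict{DataType,Any}` storage.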