JuMP - Nested Optimization with user-defined functions

Hi!

I have a nested optimization problem that I am trying to solve using JuMP. The inner problem is a linear problem with power constraints, which I solve with JuMP + Mosek. The outer problem is a concave problem defined over a set of linear constraints. The inner problem gives me both the value of the objective conditional on the variable I am optimizing in the outer problem, and its gradient.

Ipopt seems like the natural optimizer for this problem. However, I have been using NLopt instead, because I cannot get the user-defined function to work.

This is not a MWE, but I am trying to do something like this:

using Distributed, Ipopt, JuMP  # Distributed provides @everywhere

# Value of the outer objective: the inner-problem value averaged over the
# s = 1:data.Sexp draws, minus a linear cost in x.
@everywhere function f1_val(data::Data, x::Vector{Float64}, n::Int)
    Eπ, Eν = 0.0, zeros(data.I)
    for s in 1:data.Sexp
        temp = f2(data, x, s, n)  # inner solve: returns (value, gradient)
        Eπ += temp[1]
        Eν .+= temp[2]
    end
    return Eπ / data.Sexp - sum(data.α[:, n] .* x)
end

# Gradient of f1_val; note it re-solves the same inner problems as f1_val.
@everywhere function f1_grad(data::Data, x::Vector{Float64}, n::Int)
    Eπ, Eν = 0.0, zeros(data.I)
    for s in 1:data.Sexp
        temp = f2(data, x, s, n)
        Eπ += temp[1]
        Eν .+= temp[2]
    end
    return Eν ./ data.Sexp .- data.α[:, n]
end
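
For context, f2 is not shown above, but here is a minimal sketch of the shape of inner problem I have in mind, solved with JuMP + Mosek. Everything in it (the fields data.p and data.I, the names q and capacity, and the particular power-cone constraint) is a hypothetical placeholder for the real model:

using JuMP
import MathOptInterface as MOI
import MosekTools

# Hypothetical inner problem for draw s: linear objective, one power-cone
# constraint, and capacity constraints q .<= x. Returns the optimal value and
# the gradient of the value function w.r.t. x (the capacity shadow prices).
function f2(data, x::Vector{Float64}, s::Int, n::Int)
    model = Model(MosekTools.Optimizer)
    set_silent(model)
    @variable(model, q[1:data.I] >= 0)
    @variable(model, t)
    @objective(model, Max, data.p[s] * sum(q) - t)
    # Power constraint: [t, 1, q[1]] in PowerCone(0.5) encodes q[1]^2 <= t.
    @constraint(model, [t, 1.0, q[1]] in MOI.PowerCone(0.5))
    @constraint(model, capacity[i in 1:data.I], q[i] <= x[i])
    optimize!(model)
    return objective_value(model), shadow_price.(capacity)
end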

I would like to find the vector x that maximizes f1_val.

I don’t fully understand your example because I don’t know what f2 or the data are, but here’s an example of solving nested optimization problems via user-defined functions:

using JuMP
import Ipopt

function bilevel_problem()
    # Memoize the inner solve: cache the last x with its objective value and
    # optimal y, so that f and ∇f don't both re-solve the subproblem.
    last_x, last_f, last_y = nothing, 0.0, [NaN, NaN]
    function _update(x...)
        if last_x == x
            return last_y  # same point as the previous call; reuse the cache
        end
        model = Model(Ipopt.Optimizer)
        set_silent(model)
        @variable(model, y[1:2])
        @NLobjective(
            model,
            Max,
            y[1] * x[1]^2 + y[2] * x[2]^2 - x[1] * y[1]^4 - 2 * x[2] * y[2]^4,
        )
        @constraint(model, (y[1] - 10)^2 + (y[2] - 10)^2 <= 25)
        optimize!(model)
        @assert termination_status(model) == LOCALLY_SOLVED
        last_x = x
        last_f = objective_value(model)
        last_y[1] = value(y[1])
        last_y[2] = value(y[2])
        return last_y
    end
    # Outer objective value at x (reuses the cached inner solve).
    function f(x...)
        _update(x...)
        return last_f
    end
    # Gradient of f with respect to x, holding the inner solution y fixed,
    # which is valid at the inner optimum by the envelope theorem.
    function ∇f(g, x...)
        _update(x...)
        g[1] = 2 * x[1] * last_y[1] - last_y[1]^4
        g[2] = 2 * x[2] * last_y[2] - 2 * last_y[2]^4
        return
    end
    return _update, f, ∇f
end

function bench_bilevel_nlp()
    model = Model(Ipopt.Optimizer)
    set_optimizer_attribute(model, "tol", 1e-5)
    @variable(model, x[1:2] >= 0)
    subproblem, f, ∇f = bilevel_problem()
    # Register the 2-argument function with its hand-coded gradient so Ipopt
    # does not try to differentiate through the inner solve automatically.
    register(model, :bilevel_f, 2, f, ∇f)
    @NLobjective(model, Min, bilevel_f(x[1], x[2]) + x[1]^2 + x[2]^2)
    optimize!(model)
    @assert termination_status(model) == LOCALLY_SOLVED
    println(objective_value(model))
    println("x = ", value.(x))
    println("y = ", subproblem(value.(x)...))
    return
end
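
To run the whole nested solve:

bench_bilevel_nlp()

Because _update memoizes on x, Ipopt's separate evaluations of f and ∇f at the same point cost only one inner solve per outer iterate.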