JuMP with nonlinear vector constraints?

This question is related to this old post, but I’m wondering if JuMP can now handle vector-valued nonlinear functions in constraints. I have a large-scale problem where a nonlinear function of the variables x is multiplied by a matrix A to produce another vector, and I need every element of that vector to be nonnegative.

Here is a simple example:

using JuMP, Ipopt, Statistics

A = [1.1 2.2; 3.3 4.4]

function constr(x::T...) where {T<:Real}
    transformed = 1 ./ (1 .+ x)
    # 2x1 vector output
    lhs = A * transformed
    return lhs
end

function obj(x::T...) where {T<:Real}
    transformed = 1 ./ (1 .+ x)
    lhs = transformed
    return var(lhs)
end

model = Model(Ipopt.Optimizer)

@variable(model, 0 <= x[1:2] <= 1)
@constraint(model, budget, sum(x) == 1)

constr takes in the vector of x variables and returns a vector. I’d like all elements of this vector to be constrained to be non-negative. I’ve tried implementing this with the following:

for i in 1:2
    register(model, Symbol("constr_$i"), 2, x -> constr(x)[i], autodiff = true)
    @NLconstraint(model, constr(x...)[i] >= 0)
end

but I get the following error:

 MethodError: no method matching (::var"#27#28"{Int64})(::ForwardDiff.Dual{ForwardDiff.Tag{MathOptInterface.Nonlinear.var"#15#16"{var"#27#28"{Int64}}, Float64}, Float64, 2}, ::ForwardDiff.Dual{ForwardDiff.Tag{MathOptInterface.Nonlinear.var"#15#16"{var"#27#28"{Int64}}, Float64}, Float64, 2})

Here is the rest of the code, which yields a workable example if you drop the constr part:

register(model, :obj, 2, obj, autodiff = true)

@NLobjective(model, Min, obj(x...))
@time JuMP.optimize!(model)

Hi @irudik, welcome to the forum.

Your issue is that your constr function doesn’t work. The splatted arguments arrive as a Tuple, and a Matrix cannot be multiplied by a Tuple:

julia> A = [1.1 2.2; 3.3 4.4]
2×2 Matrix{Float64}:
 1.1  2.2
 3.3  4.4

julia> function constr(x::T...) where {T<:Real}
           transformed = 1 ./ (1 .+ x)
           # 2x1 vector output
           lhs = A * transformed
           return lhs
       end
constr (generic function with 1 method)

julia> constr(1.0, 1.0)
ERROR: MethodError: no method matching *(::Matrix{Float64}, ::Tuple{Float64, Float64})

It’ll work if you collect x into a vector:

julia> A = [1.1 2.2; 3.3 4.4]
2×2 Matrix{Float64}:
 1.1  2.2
 3.3  4.4

julia> function constr(x::T...) where {T<:Real}
           x = collect(x)
           transformed = 1 ./ (1 .+ x)
           # 2x1 vector output
           lhs = A * transformed
           return lhs
       end
constr (generic function with 1 method)

julia> constr(1.0, 1.0)
2-element Vector{Float64}:
 1.6500000000000001
 3.85
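
Note that the closure passed to register also has to accept splatted scalar arguments: the x -> constr(x)[i] closure in your loop takes a single argument, which is why ForwardDiff ends up calling it with two Dual arguments and throwing the MethodError you posted. Putting the two fixes together, a minimal sketch of the registered-function version looks something like this (the constr_1/constr_2 names are just illustrative):

using JuMP, Ipopt, Statistics

A = [1.1 2.2; 3.3 4.4]

function constr(x::T...) where {T<:Real}
    x = collect(x)              # Tuple -> Vector so A * x works
    return A * (1 ./ (1 .+ x))
end

obj(x::T...) where {T<:Real} = var(collect(1 ./ (1 .+ x)))

model = Model(Ipopt.Optimizer)
@variable(model, 0 <= x[1:2] <= 1)
@constraint(model, budget, sum(x) == 1)

# Register one scalar-valued function per element of the vector output.
# Each closure splats its arguments through to constr.
register(model, :constr_1, 2, (y...) -> constr(y...)[1]; autodiff = true)
register(model, :constr_2, 2, (y...) -> constr(y...)[2]; autodiff = true)
register(model, :obj, 2, obj; autodiff = true)

@NLconstraint(model, constr_1(x...) >= 0)
@NLconstraint(model, constr_2(x...) >= 0)
@NLobjective(model, Min, obj(x...))
optimize!(model)

(Each registered closure re-evaluates the full matrix-vector product just to return one element.)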

But that’s still going to be quite an inefficient way to model the problem in JuMP. You’re better off with:

using JuMP, Ipopt, Statistics
A = [1.1 2.2; 3.3 4.4]
model = Model(Ipopt.Optimizer)
@variable(model, 0 <= x[1:2] <= 1)
@variable(model, y[1:2])
@NLconstraint(model, [i=1:2], y[i] == 1 / (1 + x[i]))
@objective(model, Min, Statistics.var(y))
@constraint(model, budget, sum(x) == 1)
@constraint(model, A * y .>= 0)
optimize!(model)
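
If it helps, the solution can then be inspected with the standard JuMP queries (a quick usage sketch; the exact numbers depend on the Ipopt run):

termination_status(model)  # expect LOCALLY_SOLVED
value.(x)                  # optimal x
value.(y)                  # y[i] == 1 / (1 + x[i]) at the optimum
objective_value(model)     # the variance of y at the optimum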

Hi @odow,

Thanks so much, this was super helpful and my actual problem solves now. I appreciate the advice on the more efficient setup!
