This question is related to this old post, but I'm wondering if JuMP can now handle vector-valued nonlinear functions in constraints. I have a large-scale problem where a nonlinear function of the variables `x`, multiplied by a matrix `A`, yields another vector, and I need all elements of that vector to be nonnegative.
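In the notation of the example below, the constraint I'm after is the elementwise inequality

$$
A \, f(x) \ge 0, \qquad f(x)_j = \frac{1}{1 + x_j}.
$$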
Here is a simple example:
```julia
using JuMP, Ipopt
using Statistics  # for var

A = [1.1 2.2; 3.3 4.4]

function constr(x::T...) where {T<:Real}
    # collect the splatted scalars so the matrix-vector product works
    transformed = 1 ./ (1 .+ collect(x))
    # 2x1 vector output
    lhs = A * transformed
    return lhs
end

function obj(x::T...) where {T<:Real}
    transformed = 1 ./ (1 .+ collect(x))
    return var(transformed)
end

model = Model(Ipopt.Optimizer)
@variable(model, 0 <= x[1:2] <= 1)
@constraint(model, budget, sum(x) == 1)
```
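For context, the function itself works fine outside JuMP (the point (0.5, 0.5) is just for illustration):

```julia
# Quick sanity check: constr takes scalars and returns the
# 2-element vector A * (1 ./ (1 .+ x)).
constr(0.5, 0.5)  # ≈ [2.2, 5.1333]
```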
`constr` takes in the vector of `x` variables and returns a vector, and I'd like every element of that vector to be constrained to be nonnegative. I've tried implementing this with the following:
```julia
for i in 1:2
    register(model, Symbol("constr_$i"), 2, x -> constr(x)[i], autodiff = true)
    @NLconstraint(model, constr(x...)[i] >= 0)
end
```
but I get the following error:
```text
MethodError: no method matching (::var"#27#28"{Int64})(::ForwardDiff.Dual{ForwardDiff.Tag{MathOptInterface.Nonlinear.var"#15#16"{var"#27#28"{Int64}}, Float64}, Float64, 2}, ::ForwardDiff.Dual{ForwardDiff.Tag{MathOptInterface.Nonlinear.var"#15#16"{var"#27#28"{Int64}}, Float64}, Float64, 2})
```
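If I'm reading the error right, the registered function is being called with two scalar `ForwardDiff.Dual` arguments (one per variable), while my anonymous function `x -> constr(x)[i]` only accepts a single argument. A sketch of the registration I suspect was intended (accepting the splatted scalars; untested):

```julia
for i in 1:2
    # Accept the scalars that register's calling convention passes in,
    # then pick out element i of the vector output.
    register(model, Symbol("constr_$i"), 2, (x...) -> constr(x...)[i]; autodiff = true)
end
```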
For completeness, here is the rest of the code, which yields a workable example if you drop the `constr` part:
```julia
register(model, :obj, 2, obj, autodiff = true)
@NLobjective(model, Min, obj(x...))
@time JuMP.optimize!(model)
```
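One more thought: registering a separate scalar function per output means `constr` gets re-evaluated once per element. I believe the JuMP docs suggest a memoization pattern for user-defined functions with vector outputs; a minimal sketch along those lines (the `memoize` helper and its names are mine, and this simple version is not type-stable):

```julia
# Cache the last input so the scalar wrappers share one evaluation of foo.
function memoize(foo::Function, n_outputs::Int)
    last_x, last_f = nothing, nothing
    function foo_i(i, x::T...) where {T<:Real}
        if x !== last_x
            last_x, last_f = x, foo(x...)
        end
        return last_f[i]
    end
    return [(x...) -> foo_i(i, x...) for i in 1:n_outputs]
end

memoized_constr = memoize(constr, 2)
for i in 1:2
    register(model, Symbol("constr_$i"), 2, memoized_constr[i]; autodiff = true)
end
```

Would something like this be the recommended approach, or is there now native support for vector-valued nonlinear constraints?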