Gurobi and nonlinear

Hi,

Since version 9.0, the (Python) interface of Gurobi supports nonlinear/non-convex objective functions and constraints, notably square roots of variables in the form

\sum_i \sqrt{z_i} \leq k

. This is done through the function that lets you add a generic power constraint y = x^a:

addGenConstrPow

Link to addGenConstrPow

I tried to model an optimization problem with a nonlinear JuMP constraint of the form

@NLconstraint(model, sqrt(z1)+sqrt(z2)+… <= k)

but I get the error

ERROR: LoadError: The solver does not support nonlinear problems (i.e., NLobjective and NLconstraint).

Is there any workaround?

Thanks in advance
Fabrizio

Not specific to Gurobi (so it may not solve your problem), but the @NL... syntax has been deprecated since JuMP v1.15.

See the new API at Nonlinear Modeling · JuMP
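
For reference, with JuMP ≥ 1.15 the constraint from the first post can be written directly in the regular @constraint macro. A minimal sketch with placeholder data (the value of k is hypothetical, and you still need a solver that accepts general nonlinear constraints, which, as discussed below, Gurobi.jl does not at this point):

using JuMP

model = Model()
@variable(model, z[1:2] >= 0)
k = 4.0  # example bound, hypothetical value
@constraint(model, sqrt(z[1]) + sqrt(z[2]) <= k)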

Thanks @gdalle for pointing this out, it’s my first experience with nonlinear constraints and JuMP. However, it does not solve the problem. The issue seems to be related to the current Gurobi.jl interface, as pointed out here

I think you can use Gurobi.jl/src/gen110/libgrb_api.jl at 69d2d6e70ab517751275bea0d4ec0acdb263299f · jump-dev/Gurobi.jl · GitHub

From the README.md:

using JuMP, Gurobi
column(x::VariableRef) = Gurobi.c_column(backend(owner_model(x)), index(x))
model = direct_model(Gurobi.Optimizer())
@variable(model, x[i in 1:2])
@variable(model, y[1:2])
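# each call adds a general power constraint of the form y = x^a (here y[1] = x[1]^0.7 and y[2] = x[2]^3)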
GRBaddgenconstrPow(backend(model), "x1^0.7", column(x[1]), column(y[1]), 0.7, "")
GRBaddgenconstrPow(backend(model), "x2^3", column(x[2]), column(y[2]), 3.0, "")
@objective(model, Min, y[1] + y[2])
optimize!(model)

Thanks @ohmsweetohm1, this seems a viable possibility, going down to the C-level API, but I am not sure how to generalize it to a larger sum of the form

\sum_i \sqrt{y_i} \leq k

I don’t know if you have experience working with these types of non-convex constraints on large models?

Regards
Fabrizio

using JuMP, Gurobi

N = 2
k = 10.0  # example value for the right-hand side bound
column(x::VariableRef) = Gurobi.c_column(backend(owner_model(x)), index(x))
model = direct_model(Gurobi.Optimizer())
@variable(model, x[i in 1:N])
@variable(model, y[i in 1:N])
for i in 1:N
    GRBaddgenconstrPow(backend(model), "x$i^0.5", column(x[i]), column(y[i]), 0.5, "")
end
@constraint(model, sum(y) <= k)
@objective(model, Min, k) # placeholder objective
optimize!(model)

Some additional context: in Gurobi v9 and v10, non-convex constraints other than quadratic were supported in the API but handled internally via a piecewise-linear approximation. In v11, Gurobi added the option to treat these explicitly as non-convex constraints rather than approximating them piecewise-linearly. Note that non-convex quadratic constraints have been fully supported since v9.

Gurobi’s nonconvex API supports a finite set of pre-defined functions (power, sin/cos, exponential, logarithm, etc.). AFAIK, JuMP does not support passing, e.g., the constraint y = \sqrt{x} to Gurobi, because that would require a specialized API call, whereas JuMP handles nonlinear expressions/constraints in a unified way.
This means that, in order to pass y = \sqrt{x} to Gurobi, you need to use the C API.

That being said, if constraints of the form \sum_{i} \sqrt{x_{i}} \leq k are what you’re after, an alternative approach is to formulate them as \sum_{i} y_{i} \leq k, y_{i}^{2} = x_{i}, y_{i} \geq 0. The first constraint is linear, the second is a non-convex quadratic, and the last is a variable bound. You can formulate all three in JuMP and hand the model to Gurobi, without needing any C API hack.
The corresponding Julia code would be (I added bounds on x, y):

using JuMP, Gurobi

N = 2
k = 10.0  # example value for the right-hand side bound

model = direct_model(Gurobi.Optimizer())
@variable(model, x[i in 1:N] >= 0)
@variable(model, y[i in 1:N] >= 0)

# y == sqrt(x)  <=> (y^2 == x, y >= 0)
@constraint(model, [i in 1:N], y[i] * y[i] == x[i])

@constraint(model, sum(y) <= k)
@objective(model, Min, k) # placeholder objective
optimize!(model)
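
One caveat, depending on your Gurobi version: if I recall correctly, Gurobi v9 and v10 reject non-convex quadratic constraints unless the NonConvex parameter is set to 2, while v11 handles them with the default setting. If the solve above errors out with a non-convexity message, a sketch of the fix:

# allow Gurobi's non-convex quadratic algorithm (needed on v9/v10)
set_attribute(model, "NonConvex", 2)
optimize!(model)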

Thanks @mtanneau, the reformulation you gave is of course correct; I was just curious, from a JuMP standpoint, whether the original version was manageable. As you say, the API call would need to be specialized, and this is probably not so easy.
