Using feasRelax in Julia with Gurobi and JuMP

Hello everyone!
I have an optimization problem that, depending on some input data, may be infeasible. I would like to check the termination status and, in case of infeasibility, try to solve it again after relaxing some constraints. For this purpose I am trying to use Gurobi's feasRelax function.
I read this issue on GitHub, which describes the correct use of this function from Julia, but I still have some problems. In particular, I do not know how to access the results of the relaxation, and I am not sure the relaxation is performed as I expect.
Here is my minimal working example:

using JuMP, Gurobi
jump_model = direct_model(Gurobi.Optimizer())
model = backend(jump_model)

@variable(jump_model, x[1:5])

@objective(jump_model, Min, sum(x))

@constraint(jump_model, c1, x .>= 1.0)
@constraint(jump_model, c2, x .<= 2.0)
@constraint(jump_model, c3, x[1] <= -1.0) # constraint that makes the problem infeasible

JuMP.optimize!(jump_model)
println(JuMP.termination_status(jump_model))

pnlt = 2.0
error = GRBfeasrelax(model, 0, true, [], [], [c3], [pnlt])

x_res = JuMP.value.(x) # provides error
res_val = JuMP.objective_value(jump_model) # provides error

The reason I am not sure whether the relaxation is performed correctly is that GRBfeasrelax returns 0 for any penalty value I use.
Let me know if anyone has some ideas, thank you!

Per the documentation, after relaxing the problem, you need to call GRBoptimize to compute a new solution: https://www.gurobi.com/documentation/10.0/refman/c_feasrelax.html

Here are two examples: the first penalizes violations of the constraint's right-hand side (rhspen), the second penalizes violations of the variables' lower bounds (lbpen).

using JuMP
using Gurobi
model = direct_model(Gurobi.Optimizer())
@variable(model, 1 <= x[1:5] <= 2)
@objective(model, Min, sum(x))
@constraint(model, c, x[1] <= -1)
optimize!(model)
grb_model = backend(model)
rhspen = Cdouble[2.0]
feasobjP = Ref{Cdouble}()
GRBfeasrelax(grb_model, 0, 1, C_NULL, C_NULL, rhspen, feasobjP)
feasobjP[]
GRBoptimize(grb_model)
value.(x)

model = direct_model(Gurobi.Optimizer())
@variable(model, 1 <= x[1:5] <= 2)
@objective(model, Min, sum(x))
@constraint(model, c, x[1] <= -1)
optimize!(model)
grb_model = backend(model)
lbpen = fill(GRB_INFINITY, 5)
lbpen[1] = 1.0
feasobjP = Ref{Cdouble}()
GRBfeasrelax(grb_model, 0, 1, lbpen, C_NULL, C_NULL, feasobjP)
feasobjP[]
GRBoptimize(grb_model)
value.(x)
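To tie this back to the original question of relaxing only when the model turns out to be infeasible, here is a minimal sketch of the full workflow, combining the pattern above with a termination-status check (MOI.INFEASIBLE is the status JuMP reports for a provably infeasible model; the model and penalties are the same as in the first example):

```julia
using JuMP, Gurobi

# Build the model in direct mode so that backend() returns the
# raw Gurobi.Optimizer, which the C API functions can work with.
model = direct_model(Gurobi.Optimizer())
@variable(model, 1 <= x[1:5] <= 2)
@objective(model, Min, sum(x))
@constraint(model, c, x[1] <= -1)   # makes the model infeasible
optimize!(model)

if termination_status(model) == MOI.INFEASIBLE
    grb_model = backend(model)
    rhspen = Cdouble[2.0]           # one penalty per linear constraint
    feasobjP = Ref{Cdouble}()
    # relaxobjtype = 0: minimize the sum of penalized violations
    GRBfeasrelax(grb_model, 0, 1, C_NULL, C_NULL, rhspen, feasobjP)
    GRBoptimize(grb_model)          # re-solve the relaxed model
end
value.(x)
```

The conditional is the only new part; everything inside it is the same sequence of calls as in the examples above.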

Thank you very much @odow, this is exactly what I was looking for!

However, I receive an error message that prevents the execution of the relaxed optimization.
In my code, when I call grb_model = backend(model_LL), the output is:

MOIU.CachingOptimizer{MOIB.LazyBridgeOptimizer{Gurobi.Optimizer}, MOIU.UniversalFallback{MOIU.Model{Float64}}}
in state ATTACHED_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.UniversalFallback{MOIU.Model{Float64}}
  with 1 optimizer attribute
  fallback for MOIU.Model{Float64}
with optimizer MOIB.LazyBridgeOptimizer{Gurobi.Optimizer}
  with 0 variable bridges
  with 0 constraint bridges
  with 0 objective bridges
  with inner model     sense  : minimize
    number of variables             = 523
    number of linear constraints    = 7328
    number of quadratic constraints = 0
    number of sos constraints       = 0
    number of non-zero coeffs       = 66252
    number of non-zero qp objective terms  = 0
    number of non-zero qp constraint terms = 0

Then, when I execute GRBfeasrelax(grb_model, 1, 1, C_NULL, C_NULL, rhspen, feasobjP) the error returned is:

ERROR: MethodError: no method matching unsafe_convert(::Type{Ptr{Nothing}}, ::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Gurobi.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
Closest candidates are:
  unsafe_convert(::Union{Type{Ptr{Nothing}}, Type{Ptr{Base.Libc.FILE}}}, ::Base.Libc.FILE) at C:\Users\umbe\AppData\Local\Programs\Julia-1.7.3\share\julia\base\libc.jl:94    
  unsafe_convert(::Type{Ptr{T}}, ::Base.ReshapedArray{T}) where T at C:\Users\umbe\AppData\Local\Programs\Julia-1.7.3\share\julia\base\reshapedarray.jl:281
  unsafe_convert(::Type{Ptr{T}}, ::SharedArrays.SharedArray{T}) where T at C:\Users\umbe\AppData\Local\Programs\Julia-1.7.3\share\julia\stdlib\v1.7\SharedArrays\src\SharedArrays.jl:361
  ...
Stacktrace:
 [1] GRBfeasrelax(model::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Gurobi.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, relaxobjtype::Int64, minrelax::Int64, lbpen::Ptr{Nothing}, ubpen::Ptr{Nothing}, rhspen::Vector{Float64}, feasobjP::Base.RefValue{Float64})     
   @ Gurobi C:\Users\umbe\.julia\packages\Gurobi\FliRK\src\gen91\libgrb_api.jl:308
 [2] top-level scope
   @ none:1

I am trying to figure out how to solve it, but without success. What is going wrong here?

Ok, I have found the issue: the model was not defined properly in my original code. I had used model_LL = Model(Gurobi.Optimizer) instead of model_LL = direct_model(Gurobi.Optimizer()). Now that I have changed it, it works! Thanks again.
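For anyone who lands here with the same MethodError: the key difference is what backend returns. With Model(Gurobi.Optimizer), backend gives back a MOIU.CachingOptimizer wrapper (as in the printout above), which cannot be converted to the C pointer that GRBfeasrelax expects; with direct_model(Gurobi.Optimizer()), it is the raw Gurobi.Optimizer. A quick sketch to see the difference:

```julia
using JuMP, Gurobi

# Cached mode: backend() returns a CachingOptimizer wrapper,
# which holds no GRBmodel pointer for the C API to use.
cached = Model(Gurobi.Optimizer)
backend(cached) isa Gurobi.Optimizer  # false

# Direct mode: backend() returns the Gurobi.Optimizer itself,
# so GRBfeasrelax and GRBoptimize can be called on it.
direct = direct_model(Gurobi.Optimizer())
backend(direct) isa Gurobi.Optimizer  # true
```

If I read the JuMP docs correctly, unsafe_backend can retrieve the inner Gurobi.Optimizer from a cached model too, but changes made through the C API are then not reflected in the JuMP model, which is why direct_model is the safer choice here.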
