Which JuMP solvers support compute_conflict!?

I am trying to show the constraints of an infeasible problem in a human-readable format and found compute_conflict! in the docs:

Solutions · JuMP

The COSMO.jl solver doesn’t support the function. Is there any other pure Julia solver that supports the function? My optimization problem is very simple (linear objective + linear/interval constraints).

Essentially just CPLEX and Gurobi. It's a niche feature.

Thank you @odow for the quick reply. Can you elaborate on what it does exactly? I understand that it shows all constraints that “conflict” with each other causing the infeasibility of the problem, is that right?

Do you have suggestions on how to report the problematic constraints to end users? I also found primal_feasibility_report(model), but it returns a long list of constraints and the excess for each. I wonder if there is a simple function that can suggest a few constraints for the user to relax.

It's a heuristic that removes constraints and variables while keeping the model infeasible. The resulting system can still be very large, and it might be only one of many possible systems that could have been returned.

You probably want Debugging · JuMP instead
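For reference, the conflict workflow with a supporting solver looks roughly like the sketch below. This is a hedged illustration, not from the thread: it assumes Gurobi is installed, and the small model (x ≥ 2 together with x ≤ 1) is made up for demonstration.

```julia
using JuMP, Gurobi  # CPLEX.jl works analogously

model = Model(Gurobi.Optimizer)
@variable(model, x >= 2)
@constraint(model, c, x <= 1)  # clashes with the lower bound on x
optimize!(model)

if termination_status(model) == INFEASIBLE
    # Ask the solver to compute an irreducible infeasible subsystem (IIS)
    compute_conflict!(model)
    if MOI.get(model, MOI.ConflictStatus()) == MOI.CONFLICT_FOUND
        # Query whether a given constraint participates in the conflict
        status = MOI.get(model, MOI.ConstraintConflictStatus(), c)
        println(c, " => ", status)
    end
end
```

JuMP also provides copy_conflict(model), which builds a new model containing only the conflicting constraints, for human-readable reporting.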


@mtanneau helped me write a function that returns all violations of an infeasible model:

function violations(model, tol = 1e-4)
    res = ConstraintRef[]
    # Dual values from an infeasibility certificate (a Farkas ray) are only
    # meaningful when the solver proves infeasibility and returns the ray.
    if termination_status(model) == INFEASIBLE &&
       dual_status(model) == INFEASIBILITY_CERTIFICATE
        for (F, S) in list_of_constraint_types(model)
            for con in all_constraints(model, F, S)
                # A nonzero dual marks a constraint that participates in
                # the certificate of infeasibility.
                if abs(dual(con)) > tol
                    push!(res, con)
                end
            end
        end
    end
    return res
end

This function works well with the Tulip.jl and COSMO.jl solvers, but it doesn't work with HiGHS.jl, which has been the most stable solver for my application.

Any idea on how to make it work with HiGHS.jl as well? Any help is appreciated.

Here is a MWE model to test.

HiGHS can prove infeasibility in presolve, which means it doesn’t find an infeasibility certificate. If you turn off presolve, it can find an infeasibility certificate.

julia> optimize!(model)
Running HiGHS 1.5.1 [date: 1970-01-01, git hash: 93f1876e4]
Copyright (c) 2023 HiGHS under MIT licence terms
Presolving model
Problem status detected on presolve: Infeasible
Model   status      : Infeasible
Objective value     :  0.0000000000e+00
HiGHS run time      :          0.00
ERROR:   No invertible representation for getDualRay

julia> solution_summary(model)
* Solver : HiGHS

* Status
  Result count       : 0
  Termination status : INFEASIBLE
  Message from the solver:
  "kHighsModelStatusInfeasible"

* Candidate solution (result #1)
  Primal status      : NO_SOLUTION
  Dual status        : NO_SOLUTION
  Objective bound    : 0.00000e+00
  Relative gap       : Inf

* Work counters
  Solve time (sec)   : 2.24088e-04
  Simplex iterations : 0
  Barrier iterations : 0
  Node count         : -1


julia> set_attribute(model, "presolve", "off")

julia> optimize!(model)
Solving LP without presolve or with basis
Using EKK dual simplex solver - serial
  Iteration        Objective     Infeasibilities num(sum)
          0    -9.0000000000e+03 Ph1: 20(22998); Du: 9(9) 0s
         26     6.0000000000e+02 0s
Model   status      : Infeasible
Simplex   iterations: 26
Objective value     :  6.0000000000e+02
HiGHS run time      :          0.00

julia> solution_summary(model)
* Solver : HiGHS

* Status
  Result count       : 1
  Termination status : INFEASIBLE
  Message from the solver:
  "kHighsModelStatusInfeasible"

* Candidate solution (result #1)
  Primal status      : NO_SOLUTION
  Dual status        : INFEASIBILITY_CERTIFICATE
  Objective value    : 6.00000e+02
  Objective bound    : 0.00000e+00
  Relative gap       : Inf
  Dual objective value : 1.00000e+02

* Work counters
  Solve time (sec)   : 3.47093e-03
  Simplex iterations : 26
  Barrier iterations : 0
  Node count         : -1

A few other comments:

I posted this above, but it seems that you probably want to use relax_with_penalty! instead: Debugging · JuMP
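To make the suggestion concrete, here is a hedged sketch of relax_with_penalty!. It replaces each constraint with a penalized soft version and returns a map from the original constraints to their slack expressions, so after solving you can report by how much each constraint had to give. The two-constraint model is invented for illustration.

```julia
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
@variable(model, x >= 0)
@constraint(model, c1, x >= 2)
@constraint(model, c2, x <= 1)  # infeasible together with c1

# Soften every constraint; the default penalty is 1.0 per unit of violation
penalty_map = relax_with_penalty!(model)
optimize!(model)

for (con, slack) in penalty_map
    v = value(slack)
    v > 1e-6 && println(con, " violated by ", v)
end
```

This gives end users a readable report in the original units of each constraint, rather than a raw dual ray.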

Your modeling can be improved. Instead of adding bounds using @constraint, add them in the @variable macro. So instead of:

@variable(model, Vp[pset])
@constraint(model, [0.,0.] .≤ Vp .≤ [100.,100.])

do

@variable(model, 0 <= Vp[pset] <= 100)

The former adds two new rows to the constraint matrix, and leaves the variables unbounded. The latter explicitly sets the variable bounds, which the solver has special support for.
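The difference is visible in a tiny sketch (a minimal example, not from the thread): a two-sided @constraint becomes an interval row, so JuMP does not record a variable bound, whereas bounds in @variable are attached to the variable itself.

```julia
using JuMP

model = Model()
@variable(model, x)
@constraint(model, 0 <= x <= 1)  # one interval row in the constraint matrix
has_lower_bound(x)               # false: the bound lives in a row, not on x

model2 = Model()
@variable(model2, 0 <= y <= 1)   # bounds stored on the variable
has_lower_bound(y)               # true: the solver treats these specially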

Also, terms like rcosts = sum(Vr[r] for r in rset) can be slow to construct outside the macros. You’re better off calling @objective(model, Min, sum(Vr) + sum(Vb) - sum(Vc)).


Thanks for the advice :slight_smile: In my real case I have a pset that is a vector of tuples of strings. I then have two dictionaries mapping these tuples of strings to lower and upper bounds:

pmin = Dict(zip(ids, prods) .=> mins)
pmax = Dict(zip(ids, prods) .=> maxs)
@variable(model, Vp[pset])
for (id, prod) in pset
    @constraint(model, pmin[(id, prod)] ≤ Vp[(id, prod)] ≤ pmax[(id, prod)])
end

How would you rewrite it to use the bounds directly in the @variable macro?

I tried copying the example in the docs but the resulting lines all showed violations = 1.0, which is not what I would expect from the bounds and values of the optimization variables. Maybe I am misinterpreting what is being printed?

Can you please clarify what the presolve option does? Is it safe to turn it off in general?

How would you rewrite it to use the bounds directly in the @variable macro?

pmin = Dict(zip(ids, prods) .=> mins)
pmax = Dict(zip(ids, prods) .=> maxs)
@variable(model, pmin[(id,prod)] <= Vp[(id, prod) in pset] <= pmax[(id,prod)])

Maybe I am misinterpreting what is being printed?

It should print a solution with the minimal L1-norm violation of your constraints. That isn’t the same as a solution with the minimal number of constraints violated.

Can you please clarify what the presolve option does? Is it safe to turn it off in general?

Solvers use heuristics to partially solve the problem before handing it to a full algorithm like simplex or interior point. Common presolve techniques include tightening variable bounds, removing fixed variables, and removing collinear or redundant constraints.

It’s safe to turn off in general, but it can lead to increased solving times. You could just solve, then if you get infeasibility and no infeasibility certificate, turn off presolve and re-solve.
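That solve-then-retry pattern can be wrapped in a small helper. This is one possible sketch (the function name is made up; it assumes HiGHS, whose presolve option takes "off" and the default "choose"):

```julia
using JuMP, HiGHS

function optimize_with_certificate!(model)
    optimize!(model)
    # Only pay the cost of disabling presolve when the first solve proves
    # infeasibility without producing a certificate.
    if termination_status(model) == INFEASIBLE &&
       dual_status(model) != INFEASIBILITY_CERTIFICATE
        set_attribute(model, "presolve", "off")
        optimize!(model)
        set_attribute(model, "presolve", "choose")  # restore the default
    end
    return
end
```

With this, the certificate-based violations(model) function above has a better chance of finding a usable dual ray.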
