JuMP support for "linear conditional expression" reformulation as MILP in 2025

Hello,

I’d like to know to what extent the present (mid-2025) state of JuMP includes the work of @rdeits in GitHub - rdeits/ConditionalJuMP.jl: Automatic transformation of implications and complementarity into mixed-integer models in Julia. In particular, is JuMP able to generate big-M MILP reformulations for “linear implications” (I’m not sure of the canonical name for this)?

(linear expression) => (linear constraint)
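
For concreteness, the reformulation I have in mind: if a binary variable z encodes the antecedent and M is a valid upper bound on a'x - b over the variable bounds, then z => (a'x <= b) can be rewritten as the linear constraint a'x <= b + M*(1 - z). As far as I understand, ConditionalJuMP derived such M values automatically from the variable bounds using interval arithmetic.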

I see that the revamped nonlinear modeling allows using Julia’s ifelse when wrapped in an @expression, but can this generate the big-M reformulation (with automatic computation of the M bound), or does it “simply” mean that JuMP can compute the gradient and forward the expression to an NLP solver?

I also see that JuMP has gained much better support for Constraint Programming, but the tutorial focuses more on CP set constraints (AllDifferent) than on implications.

Also, back in 2019 in the ticket JuMP v0.19 support · Issue #48 · rdeits/ConditionalJuMP.jl · GitHub, @blegat mentioned that MOI’s newly added IndicatorSet could be a way to implement implications, and looking at the SCIP doc it is indeed close to the target.
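
For reference, here is my understanding of the raw MOI form today (the set has since been renamed from MOI.IndicatorSet to MOI.Indicator; a solver with native indicator support, e.g. SCIP, is assumed):

using JuMP
model = Model()
@variable(model, z, Bin)
@variable(model, x)
# z == 1 activates the constraint x <= 10.0
@constraint(model, [z, x] in MOI.Indicator{MOI.ACTIVATE_ON_ONE}(MOI.LessThan(10.0)))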

For a bit of background, I recently started playing with MiniZinc to explore some ideas. And since I ended up selecting HiGHS as the solver, I may be better off going back to my comfort zone and using JuMP. Here is a toy example:

% Mockup of decision tree formal checking,
% with a slightly more complex example compared to decision_tree_check.mzn
%
% Expected solution is UNSATISFIABLE.
%
% However, if the `tol` parameter is chosen smaller than about 2e-6,
% then the HiGHS solver finds solutions which are sometimes strange.
% Examples:
% - with tol=1e-6: x = 99.999999; xpos = 99.99999799999999; xneg = 9.99999993922529e-07;
%   and delta_x = -2.00000000916134e-06; so that balance_ok = false by a small margin
% - with tol=1e-7: delta_x = -2.84...e-14, but balance_ok = false;
%   (while balance_neg = true and balance_pos = true when using the variant without abs())
%
% Remark: the FlatZinc compiled file is 111 lines long, with many hardcoded 1e-06 values.
% Also, I don't see where the value of the tol parameter gets in...
%
% PH, June 2025

float: tol=1e-5; % tolerance for accepting delta_x==0.0

set of float: bfloat = -100.0..100.0; % bounded float
var bfloat: x;
var bfloat: xpos; % meant to be the positive part of x: max(0, x)
var bfloat: xneg; % meant to be the negative part of x: max(0,-x)

% decision tree
var bool: dt;
dt = if x>=0.0
  then (xpos=x   /\ xneg=0.0)
  else (xpos=0.0 /\ xneg=-x)
endif; 

% Operation constraints to be satisfied
var bfloat: delta_x;
delta_x = xpos - xneg - x ; % should be zero because x = xpos-xneg
var bool: balance_ok;
%balance_ok = delta_x == 0.0; % Compilation fails: "Error: solver backend cannot handle constraint: float_lin_ne",
%                         i.e. there is no MILP implementation for the constraint "linear floating-point expression != some number" 
balance_ok = abs(delta_x) <= tol;
% variant which manually implements the abs function:
% var bool: balance_pos;
% var bool: balance_neg;
% balance_pos = delta_x <=  tol;
% balance_neg = delta_x >= -tol;
% balance_ok = balance_pos /\ balance_neg;

var bool: op = balance_ok;

% Solving for DT /\ NOT OP, that is solving for NOT(NOT DT \/ OP), that is NOT(DT => OP)
constraint dt /\ not op;

solve satisfy;

The short answer is “no”.

We have indicator constraints for bool implies {inequality}.
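
For example (a minimal sketch; a solver with native indicator support, such as SCIP, Gurobi, or CPLEX, is assumed):

using JuMP
model = Model()
@variable(model, z, Bin)
@variable(model, x <= 100)
# z == 1 implies 2x <= 10. This is passed to the solver as an indicator
# constraint; JuMP does not reformulate it into a big-M constraint.
@constraint(model, z --> {2x <= 10})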

We don’t have support for more complicated expressions.

If you use ifelse it will create a nonlinear expression to be passed to a solver like Ipopt. It doesn’t do a MIP reformulation.
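
For example (a sketch using the current nonlinear interface):

using JuMP, Ipopt
model = Model(Ipopt.Optimizer)
@variable(model, -100 <= x <= 100)
# ifelse builds a nonlinear expression; JuMP computes derivatives and passes
# it to the NLP solver as-is. No binary variables or big-M constraints appear.
@expression(model, xpos, ifelse(x >= 0, x, 0.0))
@constraint(model, xpos >= 1.0)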

Is something like GitHub - hdavid16/DisjunctiveProgramming.jl: A JuMP extension for Generalized Disjunctive Programming helpful for you?

Thanks for the pointers.

For JuMP, I had probably seen the --> implication constraint once or twice, but I had never needed it and had forgotten about it. It is clearly an important foundation.

For GitHub - hdavid16/DisjunctiveProgramming.jl: A JuMP extension for Generalized Disjunctive Programming, it’s good to know that it exists, but I’ll have to take a closer look.

All in all, it seems that only ConditionalJuMP is/was able to automatically derive the big-M value. Looking closer at the DisjunctiveProgramming.jl README, I see that there is a tighten option for big-M, which sounds promising.
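
For the record, here is my reading of the README example (values illustrative; BigM(100, true) requests a big-M reformulation with M = 100 and the tightening option enabled):

using DisjunctiveProgramming, HiGHS
model = GDPModel(HiGHS.Optimizer)
@variable(model, 0 <= x <= 20)
@variable(model, Y[1:2], Logical)
@constraint(model, 2 <= x <= 6, Disjunct(Y[1]))   # holds when Y[1] is true
@constraint(model, 8 <= x <= 11, Disjunct(Y[2]))  # holds when Y[2] is true
@disjunction(model, Y)  # x must satisfy one of the two disjuncts
@objective(model, Max, x)
optimize!(model, gdp_method = BigM(100, true))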


And thanks to the reference mentioned in the README, Convex generalized disjunctive programming (GDP), I now know that what I am playing with is a Disjunctive Program. From what I’ve just quickly read, the linearity is implicit unless the “Generalized” term is specified. And I’ve added the reference to my library.
