Relaxing Big-M constraint

Hi,

I’m formulating an optimization problem in JuMP and need some advice on relaxing a logical constraint.

The logic is as follows:
If x>260, then Z=0.
(Note: Z can only be either 0 or Zmax, with no values in between.)

I’ve currently implemented this using the Big-M method as:

x−260≤M(1−Z)
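
In JuMP, this looks roughly like the following (the bound on x, and hence the value of M, are placeholders I’ve assumed; the actual quantity of interest would be Z * Zmax):

import JuMP

model = JuMP.Model()
x_max = 1_000                                    # assumed upper bound on x (placeholder)
JuMP.@variable(model, 0 <= x <= x_max)
JuMP.@variable(model, Z, Bin)                    # 0/1 indicator
M = x_max - 260                                  # tightest valid big-M given the bound on x
JuMP.@constraint(model, x - 260 <= M * (1 - Z))  # if Z == 1 then x <= 260, so x > 260 forces Z == 0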

However, this introduces a binary variable, and I’m looking for a way to relax or linearize this constraint to avoid using binary variables altogether.

Is there a standard way to reformulate this logic in a continuous (linear) form, or perhaps an approximation that removes the need for binary variables?


Since Z can only take the two values 0 and Zmax, your problem is nonconvex from the perspective of the decision Z.

Therefore you must use an integer constraint to reflect this logic.

Here is some logic I invented that emulates yours; I don’t know if it is exactly appropriate:

import JuMP

Y = 10  # num_items_limit: an upper bound on how many items you can purchase
model = JuMP.Model()
JuMP.@variable(model, y >= 0)            # number of items you purchase (might be integer, or continuous)
JuMP.@variable(model, 0 <= u <= 1, Int)  # whether or not to set up (equivalent to a Bin variable)
JuMP.@constraint(model, y <= Y * u)      # if no set up (u == 0), then y == 0
JuMP.@expression(model, x, y - 260)      # the quantity compared against 260 in your logic
JuMP.@expression(model, z, 1 - u)        # z == 1 exactly when there is no set up
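
To actually solve this, attach a solver and an objective, for example (HiGHS and the Max objective here are just placeholders):

import HiGHS

JuMP.set_optimizer(model, HiGHS.Optimizer)
JuMP.@objective(model, Max, x)    # placeholder objective
JuMP.optimize!(model)
JuMP.value(y), JuMP.value(u)      # inspect the solution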

Note that although you may need integer constraints, MIP solvers typically solve continuous relaxation problems first anyway, so introducing the necessary integer constraints won’t complicate the situation as much as you might fear.


Hi @leila_Ra, welcome to the forum :smile:

As @WalterMadelim says, there is no way to reformulate your model exactly without introducing binary variables.

One simple approximation you could make is to relax the binary restriction on Z so that it can take continuous values. Then, in the solution, you could round Z to the nearest integer, fix the variable, and re-solve. Alternatively, you could just solve two models, one where Z=0 and one where Z=1, and then pick the better solution.
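
Here is a minimal sketch of the relax-round-fix-and-re-solve idea, assuming your model registers the binary variable under the name Z:

# `model` is your existing JuMP model containing the binary variable Z
undo = JuMP.relax_integrality(model)          # temporarily drop the binary restriction
JuMP.optimize!(model)
z_round = round(JuMP.value(model[:Z]))        # round the relaxed value to 0 or 1
JuMP.fix(model[:Z], z_round; force = true)    # fix Z at the rounded value
JuMP.optimize!(model)                         # re-solve; the model is now continuous
# call undo() if you later want the binary restriction back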

Assuming that HiGHS is used as the solver for this (MI)LP, please consider the following two questions:

  1. If you introduce the binary variable Z, how many binary variables will your model have in total?
    Roughly speaking, if the total number of binary variables exceeds 200, the computation time may reach several tens of seconds.
  2. Does your problem need to be solved repeatedly?
    The total computation time can be estimated as:
    (time for a single solve) × (number of solves needed)

You should consider whether the total computation time is acceptable for your application.

  • If yes: Introducing binary variables should not be an issue.
  • If not: You may want to consider continuous relaxation techniques. For example, you can use a nonlinear approximation of the binary variable in your model and reformulate the problem as an NLP; see the sketch below.
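
For the NLP route, one common trick is to keep Z continuous in [0, 1] and add the nonconvex constraint Z * (1 - Z) == 0, which forces Z to one of the two endpoints. A minimal sketch, assuming Ipopt as the NLP solver and placeholder bounds:

import JuMP, Ipopt   # Ipopt is an assumption; any NLP solver that accepts nonconvex quadratics works

model = JuMP.Model(Ipopt.Optimizer)
JuMP.@variable(model, 0 <= x <= 1_000)                       # assumed bound on x
JuMP.@variable(model, 0 <= Z <= 1)                           # continuous stand-in for the indicator
JuMP.@constraint(model, Z * (1 - Z) == 0)                    # nonconvex: forces Z to be 0 or 1
JuMP.@constraint(model, x - 260 <= (1_000 - 260) * (1 - Z))  # the original Big-M logic

Keep in mind that a local NLP solver may only return a locally optimal point for this nonconvex problem, so this is not a drop-in replacement for solving the MILP.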

“Roughly speaking, if the total number of binary variables exceeds 200, the computation time may reach several tens of seconds.”

Okay, I’ve seen a few claims like this recently, so your post has me interested :smile:

Is this based on personal experience? Or have you read something somewhere (e.g., from an LLM)?

In general, this is not true. There is no simple rule relating a model’s run time to its number of binary/integer variables. Here’s an easy MIPLIB instance with 20,677,405 binary variables, and here’s an unsolved MIPLIB instance with 160 binary variables.

Maybe I need to write a tutorial on this for the JuMP documentation?


Yes, that’s just based on my personal experience. Recently, I optimized a transportation problem with 401 constraints, 10,301 continuous variables, and 114 binary variables, and it took about 30 seconds to solve.

However, I agree with your point—it’s not really accurate to estimate the runtime just by looking at the number of binary variables. The constraints in the model also play a part. It’s usually best to actually run the model and see how long it takes.
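
If you want to check empirically, JuMP can report the solver’s wall-clock time directly (HiGHS shown only as an example):

import JuMP, HiGHS

model = JuMP.Model(HiGHS.Optimizer)
# ... build your model here ...
JuMP.optimize!(model)
println("solve time: ", JuMP.solve_time(model), " seconds")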
