Gurobi fails to invoke the primal simplex method when doing column generation?

I made an interesting discovery today, based on my old acquaintance, the JuMP tutorial Column generation · JuMP.

The only thing I changed was the solver (I used Gurobi instead of HiGHS).

This is the re-solve log of the master LP that I got when I set Method to 0:

LP warm-start: use basis

Iteration    Objective       Primal Inf.    Dual Inf.      Time
       0    3.6848477e+02   0.000000e+00   5.526316e-01      0s
       2    3.5932687e+02   0.000000e+00   0.000000e+00      0s

By comparison, this is the counterpart when Method = 1 (the default setting, the same as in the doc).

LP warm-start: use basis

Iteration    Objective       Primal Inf.    Dual Inf.      Time
       0   -3.3333333e+29   1.333333e+30   3.333333e-01      0s
       7    2.9900000e+02   0.000000e+00   0.000000e+00      0s

I believe everyone would find the former much more natural, at least in terms of numerics. That log comes from the primal simplex method, which you have to switch to manually; unfortunately, this is not the foolproof default setting taught by JuMP.

So the conclusion is:

  • JuMP’s doc adopts a default solver setting that does not attain the expected performance in practice.

PS The motivation behind today’s experiment was that I was wondering why the dual simplex method is the default. My guess: modern solvers are row-generation based, so the spirit of column generation runs against that grain. Luckily, the experiment produced this outcome, affirming my surmise.
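For reference, forcing primal simplex on the tutorial’s master problem takes one extra line. This uses Gurobi’s real Method parameter (0 = primal simplex, 1 = dual simplex, -1 = automatic) via JuMP’s set_attribute, the same call that appears commented out in the code below:

```julia
using JuMP
import Gurobi

model = Model(Gurobi.Optimizer)
# Method = 0 forces primal simplex for every (re-)solve of the master LP;
# Method = 1 is dual simplex; Method = -1 (automatic) lets Gurobi choose.
set_attribute(model, "Method", 0)
```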

Code
using JuMP
import DataFrames
import Gurobi
import SparseArrays
const GRB_ENV = Gurobi.Env();
struct Piece
    w::Float64
    d::Int
end
struct Data
    pieces::Vector{Piece}
    W::Float64
end
function get_data()
    data = [
        75.0 38
        70.0 41
        68.4 34
        65.5 23
        59.6 18
        53.8 33
        53.0 36
        51.0 41
        50.2 35
        32.2 37
        30.8 44
        29.8 49
        20.1 37
        16.2 36
        14.5 42
        11.0 33
        8.6 47
        8.2 35
        6.6 49
        5.1 42
    ]
    return Data([Piece(data[i, 1], data[i, 2]) for i in axes(data, 1)], 100.0)
end
function solve_pricing(data::Data, π::Vector{Float64})
    I = length(π)
    model = Model(() -> Gurobi.Optimizer(GRB_ENV))
    set_silent(model)
    @variable(model, y[1:I] >= 0, Int)
    @constraint(model, sum(data.pieces[i].w * y[i] for i in 1:I) <= data.W)
    @objective(model, Max, sum(π[i] * y[i] for i in 1:I))
    optimize!(model)
    assert_is_solved_and_feasible(model)
    number_of_rolls_saved = objective_value(model)
    if number_of_rolls_saved > 1 + 1e-8
        # Benefit of pattern is more than the cost of a new roll plus some
        # tolerance
        return SparseArrays.sparse(round.(Int, value.(y)))
    end
    return nothing
end
data = get_data();
I = length(data.pieces)
J = 1_000  # Some large number

patterns = map(1:I) do i
    n_pieces = floor(Int, data.W / data.pieces[i].w)
    return SparseArrays.sparsevec([i], [n_pieces], I)
end

model = Model(() -> Gurobi.Optimizer(GRB_ENV))
# JuMP.set_attribute(model, "Method", 1)
@variable(model, x[1:length(patterns)] >= 0)
@objective(model, Min, sum(x))
@constraint(model, demand[i in 1:I], patterns[i]' * x >= data.pieces[i].d)
optimize!(model)
assert_is_solved_and_feasible(model)
solution_summary(model)

while true
    # Solve the linear relaxation
    optimize!(model)
    assert_is_solved_and_feasible(model; dual = true)
    # Obtain a new dual vector
    π = dual.(demand)
    # Solve the pricing problem
    new_pattern = solve_pricing(data, π)
    # Stop iterating if there is no new pattern
    if new_pattern === nothing
        @info "No new patterns, terminating the algorithm."
        break
    end
    push!(patterns, new_pattern)
    # Create a new column
    push!(x, @variable(model, lower_bound = 0))
    # Update the objective coefficient of the new column
    set_objective_coefficient(model, x[end], 1.0)
    # Update the non-zeros in the coefficient matrix
    for (i, count) in zip(SparseArrays.findnz(new_pattern)...)
        set_normalized_coefficient(demand[i], x[end], count)
    end
    println("Found new pattern. Total patterns = $(length(patterns))")
end

PS2 I ran a further test; the results indicate that the default setting (dual simplex) is indeed the best go-to algorithm for a CTPLN master LP, where I used Barrier without crossover as a baseline.

# Barrier+noCrossover
Row  │J      cg_time  cg_rgap     decen_ub   decen_rgap  decen_time  Kverm  Kvermu   KverM  msttimem  msttimemu  msttimeM 
     │Int64  Int64    Float64     Float64    Float64     Float64     Int64  Float64  Int64  Float64   Float64    Float64  
─────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
   1 │16192      300  0.0528173   2.05448e5  0.0499749      317.78       1  2.193        3    0.1394    1.41158   3.92962 
   2 │16192      900  0.0173547   2.01733e5  0.016572       961.554      1  3.21189      5    0.1394    2.48585   7.85246 
   3 │16192     2700  0.00391167  2.00416e5  0.00391474    2817.17       1  4.58745      7    0.1394    4.71001  21.289   
# Dual simplex (default)
 Row │J      cg_time  cg_rgap     decen_ub   decen_rgap  decen_time  Kverm  Kvermu   KverM  msttimem    msttimemu  msttimeM
     │Int64  Int64    Float64     Float64    Float64     Float64     Int64  Float64  Int64  Float64     Float64    Float64
─────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
   1 │16192      300  0.0350881   2.04397e5  0.0338806      332.166      1  2.48141      3  0.00010705    1.14491   7.80573
   2 │16192      900  0.0130845   2.01481e5  0.0126484      988.064      1  3.48413      5  0.00010705    2.26494  16.9965
   3 │16192     2700  0.00283829  2.00259e5  0.00277348    2802.26       1  4.87148      8  0.00010705    4.66875  81.4494

JuMP uses the default settings chosen by Gurobi.

This makes sense for a one-shot solve, but less so in this case, where the master LP is built and re-solved incrementally. I’m curious whether this behavior (not invoking primal simplex) is Gurobi-exclusive. Would anyone be interested in taking a look at other common solvers, e.g. HiGHS, CPLEX, etc.?

What is your question about the logging?

When you set Method = 0, you get primal simplex.

When you set Method = 1, you get dual simplex.

When you set Method = -1, or don’t set it at all, Gurobi chooses automatically.

Here’s the doc: Parameter Reference - Gurobi Optimizer Reference Manual

There are 3 known decomposition techniques in our field:

  1. dual decomposition
  2. DW decomposition
  3. Benders decomposition

In the DW decomposition, column generation is used.
In the Benders (or dual) decomposition, row generation (cut generation) is used.

They actually have a sort of dual relationship.
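This duality can be made concrete for LPs: adding a variable (column) to the primal is exactly adding a constraint (row) to its dual, which is why column generation on one side looks like cut generation on the other. In standard form:

```latex
\min_{x \ge 0} \; c^\top x \;\; \text{s.t.} \; A x \ge b
\qquad \Longleftrightarrow \qquad
\max_{y \ge 0} \; b^\top y \;\; \text{s.t.} \; A^\top y \le c
```

A new primal column $(A_{\cdot j}, c_j)$ becomes the new dual row $A_{\cdot j}^\top y \le c_j$.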

The current conclusion is: the dual simplex method suits the row generation case, whereas the primal simplex method suits the column generation case. (You can, e.g., ask a chatbot to confirm this.)

In a previous topic I mentioned that I favor cut generation (the cutting plane method) over column generation, because the cutting plane method is more intuitive and more widely studied, and modern solvers like Gurobi support cut generation better. This is why the dual simplex method is typically the default (and hence the results in my #1 post).

The standard method to do column generation is just specified in the JuMP’s column generation doc, which by itself is correct. But I found that Gurobi is not invoking the fitting Method. I think this is a mismatch. If you know anyone from Gurobi (I remember there was one member here), I think the current situation is worth reporting.

The standard method to do column generation is just specified in the JuMP’s column generation doc, which by itself is correct

What do you mean by this? We don’t specify anything.

But I found that Gurobi is not invoking the fitting Method

What evidence do you have for this? It looks okay to me. Note that Method = 1 is not the default; the default is Method = -1.

The primal simplex method maintains a primal feasible point, so the “Primal Inf.” column is typically all zeros, while “Dual Inf.” decreases to zero gradually. (The log above is primal simplex’s.)

The dual simplex method maintains a dual feasible point, so the “Dual Inf.” column is typically all zeros, while “Primal Inf.” decreases to zero gradually.


You call the standard column generation API, i.e. set_normalized_coefficient (although I’ve never written my own code with it). That API maps to Gurobi’s standard API for modifying a coefficient of a constraint, so I assume it is strongly connected to the usage of column generation.
When we do column generation, we need to modify coefficients of constraints and the objective (IIRC), and add variables (i.e., new columns). (By comparison, cut generation is much simpler conceptually: we only need to add a constraint.)
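To illustrate the conceptual difference, a cut generation loop in JuMP only needs @constraint. Here is a minimal sketch; find_violated_cut is a hypothetical separation oracle (not a real API), assumed to return a violated cut (a, b) with a' * x >= b, or nothing when no cut exists:

```julia
while true
    optimize!(model)
    x_val = value.(x)
    # Hypothetical separation oracle: returns (a, b) for a cut a' * x >= b
    # violated by x_val, or `nothing` if the current solution is acceptable.
    cut = find_violated_cut(x_val)
    cut === nothing && break
    a, b = cut
    # Adding a cut is just adding one new row; no coefficient surgery needed.
    @constraint(model, a' * x >= b)
end
```

Contrast this with the column generation loop above, which has to push a new variable, set its objective coefficient, and patch the constraint matrix entry by entry.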

If you observe the simplex log, in most cases it can be identified as the dual simplex’s (you can, e.g., ask a chatbot which method a particular simplex log corresponds to). The dual simplex method is the de facto default algorithm unless the LP becomes large in scale.

What’s the question? I’m a bit confused. So far my answers are “yes, and?”

Following JuMP’s doc about column generation, the user’s intention is to perform a column generation process on the master LP.

The expected behavior would be for Gurobi to invoke the primal simplex method (Method = 0).
But Gurobi cannot identify the user’s intention and uses Method = 1 instead. This is a mismatch and hurts the attainable performance.

The Gurobi optimizer works well if the user does cut generation only (e.g., that’s my style). But the current behavior is not friendly to people who want to do column generation, unless we explicitly tell them there is a manual “Method = 0” option.

This is just not the case. Gurobi is a black-box. It may decide to do anything, so long as it solves the problem.

Regardless, this is not a problem with JuMP, Julia, or even Gurobi.jl. It’s an internal algorithmic choice made by Gurobi.

:blush: I’m not averse to it. After all, CG is not my go-to style.