Cannot set the starting point for LP

Hello, I'm trying to solve an LP where my variable has the following form:

@variable(model, x[i=1:N-1, j=i+1:N, k=1:d, l=1:d] >= 0)

I've been struggling to set a starting point. I tried

set_start_value

but I get errors. Can you please let me know how I can fix this issue? It should be trivial, but nothing seems to work … Thank you!

Also, when I do
set_start_value.(all_variables(model), value.(all_variables(model)))

so that I can do warm starting from previous LPs, I get the following error:
ERROR: LoadError: MathOptInterface.UnsupportedAttribute …

Which solver are you using? Some don’t support starting values.

Hi @Aida_Khajavirad, since this is your first post, welcome!

Some solvers (like Clp) don’t support start values. I’ve opened a pull request to clarify this in the documentation:
https://github.com/jump-dev/JuMP.jl/pull/2667

Hi and welcome, @Aida_Khajavirad!

Yeah, as mentioned above, some solvers do not support initial values for variables. For example, with GLPK, if we call set_start_value before optimize!, we only get a warning:

using JuMP, GLPK

model = Model(GLPK.Optimizer)
@variable(model, 0 <= x <= 2)
@variable(model, 0 <= y <= 30)
@objective(model, Max, 5x + 3y)
@constraint(model, con, 1x + 5y <= 3)

JuMP.set_start_value(all_variables(model)[1], 1.5)
optimize!(model)

┌ Warning: MathOptInterface.VariablePrimalStart() is not supported by
│ MathOptInterface.Bridges.LazyBridgeOptimizer{GLPK.Optimizer}.
│ This information will be discarded.

However, if we have already optimized our model and we then try to use set_start_value again, we get the error you describe:

optimize!(model)
JuMP.set_start_value(all_variables(model)[1], 1.5)
ERROR: MathOptInterface.UnsupportedAttribute{MathOptInterface…

One solver you can try if you need to use set_start_value is Ipopt.

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, 0 <= x <= 2)
@variable(model, 0 <= y <= 30)
@objective(model, Max, 5x + 3y)
@constraint(model, con, 1x + 5y <= 3)

JuMP.set_start_value(all_variables(model)[1], 1.5)

julia> optimize!(model)
This is Ipopt version 3.13.4, running with linear solver mumps.
NOTE: Other linear solvers might be more efficient 
(see Ipopt documentation).

Number of nonzeros in equality constraint Jacobian...:        0
Number of nonzeros in inequality constraint Jacobian.:        2
Number of nonzeros in Lagrangian Hessian.............:        0

However, it nicely warns you that “Other linear solvers might be more efficient (see Ipopt documentation).”

I have never used initial values for LP problems, though, since solvers are so efficient at them. I would also like to know whether initial values are as common for LP as they are for NLP.

I would like also to know if initial values are common for LP as they are for NLP.

For LP interior-point methods, not common. There are two main use cases for start values in the linear world:

  • As an initial feasible solution for MIPs, to provide a bound. This is only useful if the initial point is feasible.
  • As an initial basic feasible solution from which to warm-start dual simplex in branch-and-bound. (But this generally requires the full basis information, e.g., from a previous solve, not just the initial primal values.)

In general, start values for LP don’t provide a benefit, which is why solvers don’t support an API for them.
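
As a minimal sketch of the first case, here is how a known feasible point can be passed as a MIP start in JuMP. This is an illustrative model, not from the thread, and it assumes a solver that supports MIP starts (e.g., Gurobi; recent HiGHS.jl versions also accept them):

```julia
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
@variable(model, x[1:3], Bin)
@objective(model, Max, x[1] + x[2] + x[3])
@constraint(model, x[1] + x[2] <= 1)

# Pass a feasible point as a MIP start. If the solver accepts it, the
# point becomes an incumbent and provides a bound before branching.
set_start_value.(x, [1.0, 0.0, 1.0])
optimize!(model)
```

If the point is infeasible, most solvers silently discard it, so the start only helps when it genuinely satisfies all constraints.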

Hello, thanks for your response. I am trying to solve large LPs using cutting planes (exponentially many constraints, but the separation problem can be solved in polynomial time), so I add cuts in rounds and would like to run dual simplex warm-started from the solution of the previous LP. This is common practice in discrete optimization.

Where can I find out which LP solvers accept a starting point? I read in some previous posts that JuMP used to warm-start a sequence of LPs by default, and that this can now be enforced using:

vars = JuMP.all_variables(model)
JuMP.set_start_value.(vars, JuMP.value.(vars))

I tried Clp, GLPK, and HiGHS and they don’t accept it. Which simplex-based LP solver can I use that accepts this type of warm-starting?

Many thanks for helping me out!

I am trying to solve large LPs using cutting planes

I do this in SDDP.jl (GitHub - odow/SDDP.jl: Stochastic Dual Dynamic Programming in Julia) so the high-level answer is: “mostly it just works without you doing anything.” A good way to check is just to test multiple different solvers and see which is the fastest. There isn’t a way to check the exact features supported by a solver.

The in-depth answer to this is a little complicated, because it depends on the ability of the underlying solver, and technical details in the Julia wrapper of how that solver interacts with JuMP.

There are two main categories of solver interfaces to JuMP:

  • packages which support in-place modification (e.g., adding cuts)
  • packages which don’t support in-place modification

If a package doesn’t support in-place modification, we build a new model at every solve, which discards solution information. This includes solvers like Clp.jl. Cutting-plane methods using Clp are going to be slow.

If the package does support in-place modification, then a workflow like

optimize!(model)
@constraint(model, ...)
optimize!(model)

efficiently updates the model in-place and preserves the prior solution information (including basis information).

GLPK, HiGHS, Gurobi, and CPLEX are all in this camp. (You may need to set a parameter to force them to use dual simplex instead of barrier, but this likely matters only for the first solve. If you solve the same problem repeatedly, they will choose dual simplex automatically.)
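
For concreteness, a cut-in-rounds loop with this workflow might look like the following sketch. The model and the `separate` oracle are made up for illustration (`separate` returns a violated cut as `(coefficients, rhs)`, or `nothing` when no cut is violated), and HiGHS stands in for any solver that supports in-place modification:

```julia
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
@variable(model, 0 <= x[1:2] <= 10)
@objective(model, Min, x[1] + x[2])

# Hypothetical separation oracle: given the current point, return a
# violated constraint a'x >= b as (a, b), or `nothing` if none exists.
function separate(x_val)
    return sum(x_val) < 3 ? ([1.0, 1.0], 3.0) : nothing
end

function add_cuts_in_rounds!(model, x)
    while true
        optimize!(model)
        cut = separate(value.(x))
        cut === nothing && return
        a, b = cut
        # The constraint is added to the model in place, so the next
        # solve can warm-start dual simplex from the previous basis.
        @constraint(model, a' * x >= b)
    end
end

add_cuts_in_rounds!(model, x)
```

No explicit call to set_start_value is needed here: with an in-place solver interface, the basis from the previous solve is reused automatically.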

Got it, many thanks for clarifying all this for me. Indeed, once I switched from Clp to Gurobi, I observed a huge speedup, and dual infeasibility is zero at the first iteration of each intermediate LP, confirming your point!

Cheers,
Aida
