Resuming a run by writing and reading back an LP file

I am implementing Benders decomposition using JuMP in Julia. Feasibility cuts are added to the master problem repeatedly until the subproblem is feasible for the master problem's candidate solution. The run takes too long, and I need to turn off the computer after a while. I was able to save the latest state of the master problem (it is implemented in global scope) by terminating the run and calling

write_to_file(master_problem, "MP_day1.lp")

However, I do not understand how to read back the LP file saved above and continue the run. Essentially, I want to resume with all of the feasibility/optimality cuts that the previous run added to the master problem retained.

I am using the Mosek solver. HiGHS also works, but it is slower on my model.

Please guide me.

I am looking for something like

  1. Read the LP file
  2. Convert it to a JuMP model
  3. The rest of the workflow stays the same (so all of the cuts generated in the previous run are recovered in the master problem).

Should I be able to query information about the variables in my master problem using the names I defined? I ask because, when I inspected the LP file, I found that the variable names had changed (the JuMP containers are unwrapped, I suppose).

Hi @saikrishna-nadella, welcome to the forum :smile:

Don’t try to save the state of the first-stage problem by writing it to a file. As you have seen, this is a lossy operation, because the file format does not retain higher-level structure such as JuMP containers.
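(For completeness: JuMP can read an LP file back via `read_from_file`, but you get a flat model with no containers, and variables must be looked up by their string names, which the LP writer may have sanitized. A minimal sketch; the filename is just an example:)

```julia
using JuMP

# Write a tiny model to an LP file, then read it back to see what survives.
model = Model()
@variable(model, x[1:2] >= 0)
@objective(model, Min, x[1] + 2 * x[2])
write_to_file(model, "tiny.lp")

new_model = read_from_file("tiny.lp")
# The `x` container is gone. Look up variables by name instead; note that the
# LP format may rewrite names containing characters like `[` and `]`, in which
# case `variable_by_name` returns `nothing` for the original name.
x1 = variable_by_name(new_model, "x[1]")
```

This is why saving the cuts themselves, as below, is the more robust approach.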

Instead, save the list of cuts to a file as you find them. Then, later, you can read them in.

This should point you in the right direction:

julia> using JuMP

julia> import HiGHS

julia> import JSON

julia> G = [
           0 3 2 2 0 0 0 0
           0 0 0 0 5 1 0 0
           0 0 0 0 1 3 1 0
           0 0 0 0 0 1 0 0
           0 0 0 0 0 0 0 4
           0 0 0 0 0 0 0 2
           0 0 0 0 0 0 0 4
           0 0 0 0 0 0 0 0
       ];

julia> function build_firststage(G)
           n = size(G, 1)
           M = -sum(G)
           model = Model(HiGHS.Optimizer)
           set_silent(model)
           @variable(model, x[1:n, 1:n], Bin)
           @variable(model, θ >= M)
           @constraint(model, sum(x) <= 11)
           @objective(model, Min, 0.1 * sum(x) + θ)
           function solve_subproblem(x_bar)
               model = Model(HiGHS.Optimizer)
               set_silent(model)
               @variable(model, x[i in 1:n, j in 1:n] == x_bar[i, j])
               @variable(model, y[1:n, 1:n] >= 0)
               @constraint(model, [i = 1:n, j = 1:n], y[i, j] <= G[i, j] * x[i, j])
               @constraint(model, [i = 2:n-1], sum(y[i, :]) == sum(y[:, i]))
               @objective(model, Min, -sum(y[1, :]))
               optimize!(model)
               @assert is_solved_and_feasible(model; dual = true)
               return (obj = objective_value(model), y = value.(y), π = reduced_cost.(x))
           end
           return model, solve_subproblem    
       end
build_firststage (generic function with 1 method)

julia> begin
           model, solve_subproblem = build_firststage(G)
           x, θ = model[:x], model[:θ]
           for k in 1:100
               optimize!(model)
               @assert is_solved_and_feasible(model)
               lower_bound = objective_value(model)
               x_k = value.(x)
               ret = solve_subproblem(x_k)
               upper_bound = (objective_value(model) - value(θ)) + ret.obj
               if abs(upper_bound - lower_bound) / abs(upper_bound) < 1e-6
                   break
               end
               open("cuts.jsonl", "a") do io
                   cut = JSON.json(Dict("obj" => ret.obj, "pi" => ret.π, "x" => x_k))
                   println(io, cut)
               end
               @constraint(model, θ >= ret.obj + sum(ret.π .* (x .- x_k)))
           end
           optimize!(model)
           @assert is_solved_and_feasible(model)
           solution_summary(model)
       end
* Solver : HiGHS

* Status
  Result count       : 1
  Termination status : OPTIMAL
  Message from the solver:
  "kHighsModelStatusOptimal"

* Candidate solution (result #1)
  Primal status      : FEASIBLE_POINT
  Dual status        : NO_SOLUTION
  Objective value    : -5.10000e+00
  Objective bound    : -5.10000e+00
  Relative gap       : 0.00000e+00
  Dual objective value : NaN

* Work counters
  Solve time (sec)   : 7.37485e-03
  Simplex iterations : 11
  Barrier iterations : -1
  Node count         : 1

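As an aside, if you would rather avoid a JSON.jl dependency, the standard-library `Serialization` module can persist the cut data too, with the caveat that its binary format is not guaranteed to be stable across Julia versions, so JSON remains the safer choice for long-lived files. A sketch with made-up cut data and a hypothetical filename:

```julia
using Serialization

# Append one cut per `serialize` call; deserialize sequentially to read them back.
cut = (obj = 1.5, π = [0.0 1.0; 2.0 3.0], x = [1.0 0.0; 0.0 1.0])
open("cuts.bin", "a") do io
    serialize(io, cut)
end

cuts = Any[]
open("cuts.bin", "r") do io
    while !eof(io)
        push!(cuts, deserialize(io))
    end
end
```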

julia> begin
           new_model, solve_subproblem = build_firststage(G)
           new_x, new_θ = new_model[:x], new_model[:θ]
           for data in JSON.parse.(readlines("cuts.jsonl")) 
               # The `reduce` part depends on the type of your state variables. We need
               # to convert a list-of-lists from JSON back into a Matrix
               π = reduce(hcat, data["pi"])  
               x_k = reduce(hcat, data["x"])
               @constraint(new_model, new_θ >= data["obj"] + sum(π .* (new_x .- x_k)))
           end
           optimize!(new_model)
           @assert is_solved_and_feasible(new_model)
           solution_summary(new_model)
       end
* Solver : HiGHS

* Status
  Result count       : 1
  Termination status : OPTIMAL
  Message from the solver:
  "kHighsModelStatusOptimal"

* Candidate solution (result #1)
  Primal status      : FEASIBLE_POINT
  Dual status        : NO_SOLUTION
  Objective value    : -5.10000e+00
  Objective bound    : -5.10000e+00
  Relative gap       : 0.00000e+00
  Dual objective value : NaN

* Work counters
  Solve time (sec)   : 9.23943e-03
  Simplex iterations : 11
  Barrier iterations : -1
  Node count         : 1
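A note on the `reduce(hcat, ...)` step above: JSON.jl writes a `Matrix` as an array of arrays, one inner array per column, and `JSON.parse` returns it as a `Vector` of `Vector`s. `reduce(hcat, ...)` concatenates those columns back into a `Matrix`. A standalone illustration in plain Julia:

```julia
# Each inner vector is one column of the original 2×2 matrix.
cols = [[1.0, 2.0], [3.0, 4.0]]
M = reduce(hcat, cols)
# M == [1.0 3.0; 2.0 4.0]
```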

Hi @odow,
Thank you very much :handshake: for your example implementation, which solves my problem exactly.


I am new to Julia and JuMP. Thanks for helping out. Help like this encourages the use of these languages and tools.


Thanks! We try hard to make this forum a friendly place to post questions :smile:
