Connecting a simple first-order solver for standard form linear programs to JuMP

Dear All,

To demonstrate the use of first-order algorithms for solving linear programs in a graduate course that I am TAing (MIT 15.084: Nonlinear Optimization, https://gitlab.com/vanparys-courses/15.084-sp23), I have written a simple Julia package called SimplePDHG.jl (https://github.com/Shuvomoy/SimplePDHG.jl) that solves standard form linear programs of the form

\textrm{min}_x c^{\top} x \quad \textrm{subject to } Ax = b, x\geq 0

using the vanilla PDHG algorithm (https://odlgroup.github.io/odl/math/solvers/nonsmooth/pdhg.html).
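
Concretely, PDHG applied to the saddle-point form \min_{x \geq 0} \max_y c^{\top} x + y^{\top}(Ax - b) alternates a projected primal step with an extrapolated dual step. Here is a minimal sketch; the step-size choice and stopping rule below are illustrative, not necessarily what the package uses:

using LinearAlgebra

# Vanilla PDHG for min cᵀx subject to Ax = b, x ≥ 0 (illustrative sketch).
function pdhg_sketch(A, b, c; maxit = 100_000, tol = 1e-6)
    m, n = size(A)
    x, y = zeros(n), zeros(m)
    # Convergence needs τσ‖A‖₂² < 1; the Frobenius norm upper-bounds ‖A‖₂.
    τ = σ = 0.9 / norm(A)
    for _ in 1:maxit
        x_new = max.(x .- τ .* (c .+ A' * y), 0.0)  # projected primal step
        y .+= σ .* (A * (2 .* x_new .- x) .- b)     # extrapolated dual step
        if norm(A * x_new - b) <= tol * (1 + norm(b))
            return x_new, y
        end
        x = x_new
    end
    return x, y
end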

I was wondering if there is a way to create a generic linear programming model in JuMP, call the SimplePDHG.jl solver within JuMP, and then get the solution? Any documentation or other resources that I can look into regarding this will be very useful.

Update: Thanks to @odow’s help, the package now works with models created in JuMP! Package available at: https://github.com/Shuvomoy/SimplePDHG.jl


I was wondering if there is a way to create a generic linear programming model in JuMP, call the SimplePDHG.jl solver within JuMP, and then get the solution?

The short answer is “yes.”

The longer answer is, “but you’ll need to write an MOI interface.”

Let me see if I can quickly mock something up.

The documentation to read is: Implementing a solver interface · JuMP. But we have some tooling that should make this easier.

Great, thanks so much @odow, please let me know!

Try this:

module SimplePDHG

import MathOptInterface as MOI
import SparseArrays

"""
    solve_pdhg(
        A::SparseArrays.SparseMatrixCSC{Float64,Int},
        b::Vector{Float64},
        c::Vector{Float64},
    )::Tuple{MOI.TerminationStatusCode,Vector{Float64}}
"""
function solve_pdhg(
    A::SparseArrays.SparseMatrixCSC{Float64,Int},
    b::Vector{Float64},
    c::Vector{Float64},
)::Tuple{MOI.TerminationStatusCode,Vector{Float64}}
    # Placeholder: the actual PDHG iterations go here.
    x = fill(NaN, size(A, 2))
    return MOI.OTHER_ERROR, x
end

MOI.Utilities.@product_of_sets(RHS, MOI.Zeros)

const OptimizerCache = MOI.Utilities.GenericModel{
    Float64,
    MOI.Utilities.ObjectiveContainer{Float64},
    MOI.Utilities.VariablesContainer{Float64},
    MOI.Utilities.MatrixOfConstraints{
        Float64,
        MOI.Utilities.MutableSparseMatrixCSC{
            Float64,
            Int,
            MOI.Utilities.OneBasedIndexing,
        },
        Vector{Float64},
        RHS{Float64},
    },
}

function MOI.add_constrained_variables(
    model::OptimizerCache,
    set::MOI.Nonnegatives,
)
    x = MOI.add_variables(model, MOI.dimension(set))
    MOI.add_constraint.(model, x, MOI.GreaterThan(0.0))
    ci = MOI.ConstraintIndex{MOI.VectorOfVariables,MOI.Nonnegatives}(x[1].value)
    return x, ci
end

mutable struct Optimizer <: MOI.AbstractOptimizer
    x_primal::Dict{MOI.VariableIndex,Float64}
    termination_status::MOI.TerminationStatusCode

    function Optimizer()
        return new(Dict{MOI.VariableIndex,Float64}(), MOI.OPTIMIZE_NOT_CALLED)
    end
end

function MOI.is_empty(model::Optimizer)
    return isempty(model.x_primal) &&
        model.termination_status == MOI.OPTIMIZE_NOT_CALLED
end

function MOI.empty!(model::Optimizer)
    empty!(model.x_primal)
    model.termination_status = MOI.OPTIMIZE_NOT_CALLED
    return
end

function MOI.supports_constraint(
    ::Optimizer,
    ::Type{MOI.VectorAffineFunction{Float64}},
    ::Type{MOI.Zeros},
)
    return true
end

MOI.supports_add_constrained_variables(::Optimizer, ::Type{MOI.Reals}) = false

function MOI.supports_add_constrained_variables(
    ::Optimizer,
    ::Type{MOI.Nonnegatives},
)
    return true
end

function MOI.supports(
    ::Optimizer,
    ::MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}},
)
    return true
end

function MOI.optimize!(dest::Optimizer, src::MOI.ModelLike)
    cache = OptimizerCache()
    index_map = MOI.copy_to(cache, src)
    @assert all(iszero, cache.variables.lower)
    @assert all(==(Inf), cache.variables.upper)
    A = convert(
        SparseArrays.SparseMatrixCSC{Float64,Int},
        cache.constraints.coefficients,
    )
    # MatrixOfConstraints stores each row as aᵢᵀx + bᵢ ∈ {0}; see the note
    # on the sign of b below.
    b = cache.constraints.constants
    c = zeros(size(A, 2))
    offset = cache.objective.scalar_affine.constant  # constant objective offset (unused here)
    for term in cache.objective.scalar_affine.terms
        c[term.variable.value] += term.coefficient
    end
    if cache.objective.sense == MOI.MAX_SENSE
        c *= -1
    end
    dest.termination_status, x_primal = solve_pdhg(A, b, c)
    for x in MOI.get(src, MOI.ListOfVariableIndices())
        dest.x_primal[x] = x_primal[index_map[x].value]
    end
    return index_map, false
end

function MOI.get(model::Optimizer, ::MOI.VariablePrimal, x::MOI.VariableIndex)
    return model.x_primal[x]
end

function MOI.get(model::Optimizer, ::MOI.ResultCount)
    return model.termination_status == MOI.OPTIMAL ? 1 : 0
end

function MOI.get(model::Optimizer, ::MOI.RawStatusString)
    return "$(model.termination_status)"
end

MOI.get(model::Optimizer, ::MOI.TerminationStatus) = model.termination_status

function MOI.get(model::Optimizer, ::MOI.PrimalStatus)
    if model.termination_status == MOI.OPTIMAL
        return MOI.FEASIBLE_POINT
    else
        return MOI.NO_SOLUTION
    end
end

MOI.get(model::Optimizer, ::MOI.DualStatus) = MOI.NO_SOLUTION

MOI.get(::Optimizer, ::MOI.SolverName) = "PDHG"

end

using JuMP
model = Model(SimplePDHG.Optimizer)
@variable(model, x >= 1)
@constraint(model, x <= 2)
@objective(model, Max, x + 1)
optimize!(model)
solution_summary(model; verbose = true)

cc @blegat, perhaps I should add this as a tutorial, with an implementation of the simplex method?

Great, thanks so much @odow! I will test it and let you know. Also, if I get time, I will try to make a step-by-step tutorial showing how to do this for my students and upload the video to YouTube.


Ah, there’s probably a little bug. I think I set up Ax + b in {0}, so the b might be the negative of what you expect.
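
If so, the fix in MOI.optimize! should be a one-line sign flip (untested):

    # Each row is stored as aᵢᵀx + bᵢ ∈ {0}, so negate the stored
    # constants to recover the right-hand side of Ax = b:
    b = -cache.constraints.constants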

Sounds good, I will try to implement it step by step and keep you posted if I run into issues. The code that you provided is very helpful for getting started. Thanks again @odow


Thanks, adding a tutorial for this would indeed be helpful. The code is short and concise, but it’s difficult to get it right the first time.


Hi @odow, the code you provided is working great with minor modifications regarding the sign of b! The package (fewer than 350 lines of code) is available at https://github.com/Shuvomoy/SimplePDHG.jl and can be installed by running

] add https://github.com/Shuvomoy/SimplePDHG.jl.git

Thanks again for your help, really appreciate it. The students will like it a lot!


@odow

I have an LP solver in C++, which I can already call from Julia via this command:

objective, solution = sh.lp_solve(solver, aᵢ, aⱼ, aᵥ, b, c)
# solver: solver options (dictionary)
# aᵢ, aⱼ, aᵥ: CSR representation of the matrix A

The form that I solve is

\textrm{min}_x c^{\top} x \quad \textrm{subject to } Ax \leq b

with no x ≥ 0 constraint. How can I modify your code snippet to support this solver?

Thanks

I haven’t tested, but you’d need something like:

MOI.Utilities.@product_of_sets(RHS, MOI.Nonpositives) # <-- change

const OptimizerCache = MOI.Utilities.GenericModel{
    Float64,
    MOI.Utilities.ObjectiveContainer{Float64},
    MOI.Utilities.VariablesContainer{Float64},
    MOI.Utilities.MatrixOfConstraints{
        Float64,
        MOI.Utilities.MutableSparseMatrixCSC{
            Float64,
            Int,
            MOI.Utilities.OneBasedIndexing,
        },
        Vector{Float64},
        RHS{Float64},
    },
}

mutable struct Optimizer <: MOI.AbstractOptimizer
    x_primal::Dict{MOI.VariableIndex,Float64}
    termination_status::MOI.TerminationStatusCode

    function Optimizer()
        return new(Dict{MOI.VariableIndex,Float64}(), MOI.OPTIMIZE_NOT_CALLED)
    end
end

function MOI.is_empty(model::Optimizer)
    return isempty(model.x_primal) &&
        model.termination_status == MOI.OPTIMIZE_NOT_CALLED
end

function MOI.empty!(model::Optimizer)
    empty!(model.x_primal)
    model.termination_status = MOI.OPTIMIZE_NOT_CALLED
    return
end

function MOI.supports_constraint(
    ::Optimizer,
    ::Type{MOI.VectorAffineFunction{Float64}},
    ::Type{MOI.Nonpositives},  # <-- change
)
    return true
end

function MOI.supports_constraint( # <-- new. No variable bounds
    ::Optimizer,
    ::Type{MOI.VariableIndex},
    ::Type{
        <:Union{
            MOI.LessThan{Float64},
            MOI.GreaterThan{Float64},
            MOI.EqualTo{Float64},
            MOI.Interval{Float64},
        },
    }
)
    return false
end

function MOI.supports(
    ::Optimizer,
    ::MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}},
)
    return true
end

function MOI.optimize!(dest::Optimizer, src::MOI.ModelLike)
    cache = OptimizerCache()
    index_map = MOI.copy_to(cache, src)
    @assert all(==(-Inf), cache.variables.lower) # <-- change
    @assert all(==(Inf), cache.variables.upper)
    A = convert(
        SparseArrays.SparseMatrixCSC{Float64,Int},
        cache.constraints.coefficients,
    )
    b = cache.constraints.constants
    c = zeros(size(A, 2))
    offset = cache.objective.scalar_affine.constant
    for term in cache.objective.scalar_affine.terms
        c[term.variable.value] += term.coefficient
    end
    if cache.objective.sense == MOI.MAX_SENSE
        c *= -1
    end
    dest.termination_status, x_primal = solve_pdhg(A, b, c)
    for x in MOI.get(src, MOI.ListOfVariableIndices())
        dest.x_primal[x] = x_primal[index_map[x].value]
    end
    return index_map, false
end
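
Inside MOI.optimize! you would then hand A, b, and c to your solver. I haven’t tested this either, but since MOI gives you a SparseMatrixCSC and your lp_solve expects CSR, one way to convert is to use the fact that the CSC arrays of Aᵀ are exactly the CSR arrays of A (the sh.lp_solve call here is your own API, and I assume 1-based indices):

# The stored constants follow the Ax + b ∈ S convention, so flip the sign:
b = -cache.constraints.constants
# The CSC representation of the transpose of A gives the CSR arrays of A:
At = SparseArrays.sparse(A')
aᵢ = At.colptr                  # CSR row pointers of A
aⱼ = SparseArrays.rowvals(At)   # CSR column indices of A
aᵥ = SparseArrays.nonzeros(At)  # CSR nonzero values of A
objective, solution = sh.lp_solve(solver, aᵢ, aⱼ, aᵥ, b, c)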

Thanks, I will test it out.

Thanks @odow; it worked well with a few modifications. How can I extract the Q matrix from this objective?

\min_{x} \frac{1}{2} x^T Q x + c^T x \quad \text{subject to} \quad A x \leq b

Considering that I have two solvers, lp_solve and qp_solve, I must differentiate between LP and QP, and extract c for an LP and (c, Q) for a QP.

You’ll need to add

function MOI.supports(
    ::Optimizer,
    ::MOI.ObjectiveFunction{MOI.ScalarQuadraticFunction{Float64}},
)
    return true
end

and then the function is available in cache.objective.scalar_quadratic, but only as a list of terms, so you’ll need to loop over cache.objective.scalar_quadratic.quadratic_terms to extract the coefficients.
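
Untested, but something like the following, placed in MOI.optimize! after A is built. Two things to note: you can distinguish LP from QP via the MOI.ObjectiveFunctionType attribute, and MOI’s convention is that a ScalarQuadraticFunction represents ½xᵀQx + aᵀx + b, so the term coefficients are entries of Q directly (for example, JuMP’s x^2 arrives with coefficient 2.0):

# Distinguish LP from QP via the objective function type:
F = MOI.get(cache, MOI.ObjectiveFunctionType())
if F == MOI.ScalarQuadraticFunction{Float64}
    n = size(A, 2)
    Q = SparseArrays.spzeros(n, n)
    for term in cache.objective.scalar_quadratic.quadratic_terms
        i, j = term.variable_1.value, term.variable_2.value
        Q[i, j] += term.coefficient
        if i != j
            Q[j, i] += term.coefficient  # mirror to keep Q symmetric
        end
    end
    # ... call qp_solve with (Q, c)
else
    # ... affine objective only: call lp_solve with c
end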

I get this error

ERROR: LoadError: MathOptInterface.UnsupportedConstraint{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}: `MathOptInterface.ScalarAffineFunction{Float64}`-in-`MathOptInterface.LessThan{Float64}` constraint is not supported by the model.

but I added

function MOI.supports_constraint(
    ::Optimizer,
    ::Type{MOI.ScalarAffineFunction{Float64}},
    ::Type{MOI.LessThan{Float64}},
)
    return true
end

The RHS was set up such that it supported only VectorAffineFunction-in-Nonpositives. Normally, bridges should be able to reformulate that for you though. How did you initialize the model?
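
For reference, bridges are enabled by default when you construct the model:

model = Model(Optimizer)  # add_bridges = true by default: JuMP can reformulate
                          # ScalarAffineFunction-in-LessThan constraints into the
                          # supported VectorAffineFunction-in-Nonpositives form
model = Model(Optimizer; add_bridges = false)  # no reformulation: every constraint
                                               # type must be supported natively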

When I try to run it, it takes forever from when I call optimize! until execution actually enters it (I put print statements with timestamps):

println(Dates.format(now(), "HH:MM:SS"), " Solving")
optimize!(model)

function MOI.optimize!(dest::Optimizer, src::MOI.ModelLike)
    println(Dates.format(now(), "HH:MM:SS"), " Solver Caching ...")

I thought that adding add_bridges = false would help. Anyhow, I changed my constraints so that the add_bridges flag is accepted, but it still takes very long. When I used Mosek, that time was very small.

Do you have a reproducible example?

Let me work on that.

If I also have \underline{x} \leq x \leq \overline{x}, what should I add in addition to MOI.Utilities.@product_of_sets(RHS, MOI.Nonpositives)?

because the following didn’t work

function MOI.supports_constraint(
    ::Optimizer,
    ::Type{MOI.VariableIndex},
    ::Type{MOI.Interval{Float64}},
)
    return true
end