Duals and constraint names for `PowerModels` and `InfrastructureModels`

I would like to (1) use PowerModels to construct a base DCOPF model and (2) make some modifications to the model (such as allowing load shed by adding new variables and modifying constraints). Beginning with an instance

file = "case30.m"
network_data = PowerModels.parse_file(file)
pm = instantiate_model(network_data, DCPPowerModel, PowerModels.build_opf)
# ... modify `pm` to add new variables and constraints
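# e.g., via the underlying JuMP model (a sketch; `pm.model` and `ids` come
# from InfrastructureModels):
#     @variable(pm.model, p_shed[i in ids(pm, :load)] >= 0)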

is there a way to retrieve the name of each constraint along with its dual value? I see `report_duals` in InfrastructureModels, but I am not sure how to obtain a DCOPF model with constraint names and duals starting from my `network_data` instance. Are there arguments I can pass to `instantiate_model` to do this?

In most cases the constraints do not have names and they are not stored, so you cannot access their dual values. (There are some exceptions.)
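
For example, one exception (if I remember correctly) is the duals output setting supported by PowerModels' DC formulations, which reports bus-balance and branch-flow duals in the result dictionary:

using PowerModels, Ipopt
result = solve_dc_opf("case30.m", Ipopt.Optimizer;
    setting = Dict("output" => Dict("duals" => true)))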

See these issues:

In most cases, if you want specific control over the subproblem and the duals, you should use PowerModels to parse the data and code your own implementation.
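
For instance, here is a minimal sketch (plain JuMP with HiGHS as a stand-in LP solver, not PowerModels' API) of how naming constraints in your own model makes duals easy to query:

using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
@variable(model, 0 <= pg <= 2)          # toy generator output
@constraint(model, balance, pg == 1.5)  # named constraint: power balance
@objective(model, Min, 10pg)
optimize!(model)
dual(model[:balance])                   # dual of the named `balance` constraint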


This is exactly what motivated us to build PGLearn.jl: we similarly wanted (1) duals and (2) the ability to easily inspect/modify the formulations.

In PGLearn.jl each formulation is contained in one file; the one corresponding to PowerModels' `DCPPowerModel` is `DCOPF` in `dcp.jl`. In that file you'll find three functions: `build_opf` for building the model (with all the constraints written out plainly, so you can find, fork, and modify them easily), `extract_primal` to get the primal solution, and `extract_dual` for the dual solution.

Here’s what your snippet would look like:

using PGLearn, PowerModels

# use PowerModels to parse the data like usual
file = "case30.m"
data = PowerModels.make_basic_network(PowerModels.parse_file(file))

# build the model
opf = PGLearn.build_opf(
  PGLearn.DCOPF,
  PGLearn.OPFData(data),
  nothing,  # optimizer constructor, e.g. Ipopt.Optimizer
)

# solve the model
PGLearn.solve!(opf)

# extract the metadata/primal/dual solutions
results = PGLearn.extract_result(opf)
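
To solve the model for real, pass an optimizer constructor instead of `nothing`, per the comment above, e.g.:

using Ipopt
opf = PGLearn.build_opf(PGLearn.DCOPF, PGLearn.OPFData(data), Ipopt.Optimizer)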

The documentation is currently quite sparse, focusing mostly on the math. But if it’s a good fit for your use case I’m happy to help you get started using it!


Oh great, thanks for the link! I think this should work for my purposes!

BTW, I’m using Julia 1.8 on a Mac and hit an error precompiling PGLearn:

julia> using PowerModels, PGLearn

[ Info: Precompiling PGLearn [5d2523b5-5e96-4b1c-8178-da2b93e9175f]
ERROR: LoadError: UndefVarError: CHOLMOD not defined
Stacktrace:
  [1] getproperty(x::Module, f::Symbol)
    @ Base ./Base.jl:31
  [2] top-level scope
    @ ~/.julia/packages/PGLearn/m0m1y/src/opf/ptdf.jl:8
  [3] include(mod::Module, _path::String)
    @ Base ./Base.jl:419
  [4] include(x::String)
    @ PGLearn ~/.julia/packages/PGLearn/m0m1y/src/PGLearn.jl:1
  [5] top-level scope
    @ ~/.julia/packages/PGLearn/m0m1y/src/opf/opf.jl:304
  [6] include(mod::Module, _path::String)
    @ Base ./Base.jl:419
  [7] include(x::String)
    @ PGLearn ~/.julia/packages/PGLearn/m0m1y/src/PGLearn.jl:1
  [8] top-level scope
    @ ~/.julia/packages/PGLearn/m0m1y/src/PGLearn.jl:24
  [9] include
    @ ./Base.jl:419 [inlined]
 [10] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt64}}, source::Nothing)
    @ Base ./loading.jl:1554
 [11] top-level scope
    @ stdin:1
in expression starting at /Users/jakeroth/.julia/packages/PGLearn/m0m1y/src/opf/ptdf.jl:8
in expression starting at /Users/jakeroth/.julia/packages/PGLearn/m0m1y/src/opf/opf.jl:304
in expression starting at /Users/jakeroth/.julia/packages/PGLearn/m0m1y/src/PGLearn.jl:1
in expression starting at stdin:1
ERROR: Failed to precompile PGLearn [5d2523b5-5e96-4b1c-8178-da2b93e9175f] to /Users/jakeroth/.julia/compiled/v1.8/PGLearn/jl_8JGlEL.
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] compilecache(pkg::Base.PkgId, path::String, internal_stderr::IO, internal_stdout::IO, keep_loaded_modules::Bool)
   @ Base ./loading.jl:1707
 [3] compilecache
   @ ./loading.jl:1651 [inlined]
 [4] _require(pkg::Base.PkgId)
   @ Base ./loading.jl:1337
 [5] _require_prelocked(uuidkey::Base.PkgId)
   @ Base ./loading.jl:1200
 [6] macro expansion
   @ ./loading.jl:1180 [inlined]
 [7] macro expansion
   @ ./lock.jl:223 [inlined]
 [8] require(into::Module, mod::Symbol)
   @ Base ./loading.jl:1144

with version info:

julia> versioninfo()
Julia Version 1.8.5
Commit 17cfb8e65ea (2023-01-08 06:45 UTC)
Platform Info:
  OS: macOS (x86_64-apple-darwin21.4.0)
  CPU: 8 × Intel(R) Core(TM) i7-7820HQ CPU @ 2.90GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-13.0.1 (ORCJIT, skylake)
  Threads: 1 on 8 virtual cores

I temporarily resolved the issue by changing the `LazyPTDF` struct field, since `SparseArrays.CHOLMOD` isn’t defined on Julia 1.8 (there CHOLMOD still lives in the SuiteSparse stdlib):

struct LazyPTDF <: AbstractPTDF
    N::Int  # number of buses
    E::Int  # number of branches
    islack::Int  # index of slack bus

    A::SparseMatrixCSC{Float64,Int}  # incidence matrix
    b::Vector{Float64}  # branch susceptances
    BA::SparseMatrixCSC{Float64,Int}  # B*A
    AtBA::SparseMatrixCSC{Float64,Int}  # AᵀBA

    # was SparseArrays.CHOLMOD.Factor{Float64, Int64}; the abstract
    # supertype below is available on Julia 1.8 as well
    F::Factorization{Float64}  # factorization of AᵀBA
    #=
    julia> using LinearAlgebra, SparseArrays;
    julia> A = sprandn(10, 10, 0.2);
    julia> A = A'*A + I;
    julia> F = cholesky(A)
    SuiteSparse.CHOLMOD.Factor{Float64}
    type:    LLt
    method:  simplicial
    maxnnz:  28
    nnz:     28
    success: true
    julia> supertypes(typeof(F))
    (SuiteSparse.CHOLMOD.Factor{Float64}, Factorization{Float64}, Any)
    =#

    # TODO: cache
end
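
An alternative would be a version guard, since CHOLMOD moved from the SuiteSparse stdlib into SparseArrays in Julia 1.9. A sketch (the `_CHOLMOD` alias is hypothetical, not what the package does):

using SparseArrays
@static if VERSION >= v"1.9"
    const _CHOLMOD = SparseArrays.CHOLMOD
else
    import SuiteSparse
    const _CHOLMOD = SuiteSparse.CHOLMOD
end
# the field could then be _CHOLMOD.Factor{Float64} (type parameters
# differ slightly across Julia versions)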

Nice catch! Fixing it in Relax LazyPTDF factorization type by klamike · Pull Request #192 · AI4OPT/PGLearn.jl · GitHub
