Struggling to use JuMP/MathOptInterface to calculate the Jacobian of constraints

Hello :smiley:

I am using JuMP to solve a chance-constrained optimization problem. The original package is very old, and I have run into issues reproducing it.

The following is the file I am trying to reproduce, which has had me stuck for more than a week:
jacobians.jl
These few lines use MathProgBase, which is deprecated and no longer maintained.

My JuMP version is v1.22.2. I am wondering if there is any way in JuMP/MathOptInterface to do the same thing as jacobians.jl?

I have tried many approaches to rewrite jacobians.jl but failed. I don’t know what else to do, so I am finally asking here. Thanks a lot for any answer … :pray:

Hi @SSGD, welcome to the forum.

See Computing Hessians · JuMP. You need to adapt it slightly, like so:

julia> using JuMP, Ipopt, SparseArrays

julia> function build_nlp_evaluator(model)
           rows = Any[]
           nlp = MOI.Nonlinear.Model()
           for (F, S) in list_of_constraint_types(model)
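               # Skip variable bounds (F <: VariableRef); they do not add rows to the Jacobian.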
               if !(F <: VariableRef) 
                   for ci in all_constraints(model, F, S)
                       push!(rows, ci)
                       object = constraint_object(ci)
                       MOI.Nonlinear.add_constraint(nlp, object.func, object.set)
                   end
               end
           end
           MOI.Nonlinear.set_objective(nlp, objective_function(model))
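           # The variable order passed here determines the Jacobian's column order.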
           evaluator = MOI.Nonlinear.Evaluator(
               nlp,
               MOI.Nonlinear.SparseReverseMode(),
               index.(all_variables(model)),
           )
           return evaluator, rows
       end
build_nlp_evaluator (generic function with 1 method)

julia> begin
           model = Model(Ipopt.Optimizer)
           set_silent(model)
           @variable(model, x[i = 1:2], start = -i)
           @constraint(model, g_1, x[1]^2 <= 1)
           @constraint(model, g_2, (x[1] + x[2])^2 <= 2)
           @objective(model, Min, (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2)
           optimize!(model)
       end

julia> x_star = value.(x)
2-element Vector{Float64}:
 0.7903587565231842
 0.6238546272155127

julia> evaluator, rows = build_nlp_evaluator(model);

julia> MOI.initialize(evaluator, [:Jac]);

julia> sparsity = MOI.jacobian_structure(evaluator)
3-element Vector{Tuple{Int64, Int64}}:
 (1, 1)
 (2, 1)
 (2, 2)

julia> I, J, V = first.(sparsity), last.(sparsity), zeros(length(sparsity));

julia> MOI.eval_constraint_jacobian(evaluator, V, x_star)

julia> Jac = SparseArrays.sparse(I, J, V, length(rows), num_variables(model))
2×2 SparseMatrixCSC{Float64, Int64} with 3 stored entries:
 1.58072   ⋅ 
 2.82843  2.82843
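
The rows of Jac follow the order of the rows vector returned by build_nlp_evaluator, and the columns follow the order of all_variables(model).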

I should add a section on Jacobians to the documentation.

Thanks a lot @odow for the reply, but some issues remain:

The original jacobians.jl uses only part of the Jacobian, indexing particular variables (x = [v;θ]) and the corresponding constraints (a subset of the nonlinear constraints, flows_constr).

The function in jacobians.jl is defined as:

function extract_jacobian(m, nonlinear_constraints, variables)

This function is used at lines 126-130 of model_nl.jl, as shown below:

125 # linearized map from (v,θ) to flows
126 x = [v;θ]
127 flows = [pij;pji;qij;qji]
128 flows_constr = [pij_constr;pji_constr;qij_constr;qji_constr]
129
130 vθ_to_flows_const, vθ_to_flows_J = extract_jacobian(m, flows_constr, x)

How can I extract this part of the Jacobian (jac[rows, cols]), based on x = [v;θ] and flows_constr, rather than the whole Jacobian?

If you want only the Jacobian of a subset of the constraints, then do something like

julia> function build_nlp_evaluator(model, nonlinear_constraints)
           nlp = MOI.Nonlinear.Model()
           for ci in nonlinear_constraints
               object = constraint_object(ci)
               MOI.Nonlinear.add_constraint(nlp, object.func, object.set)
           end
           MOI.Nonlinear.set_objective(nlp, objective_function(model))
           evaluator = MOI.Nonlinear.Evaluator(
               nlp,
               MOI.Nonlinear.SparseReverseMode(),
               index.(all_variables(model)),
           )
           return evaluator, nonlinear_constraints
       end
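
To get the jac[rows, cols] block, note that the columns of the full Jacobian follow the order of all_variables(model). Below is a minimal sketch, where jacobian_subset is a hypothetical helper (not a JuMP function) that reuses build_nlp_evaluator from above and assumes the model has already been solved:

julia> function jacobian_subset(model, constraints, variables)
           evaluator, rows = build_nlp_evaluator(model, constraints)
           MOI.initialize(evaluator, [:Jac])
           sparsity = MOI.jacobian_structure(evaluator)
           I, J = first.(sparsity), last.(sparsity)
           V = zeros(length(sparsity))
           # Evaluate the Jacobian at the current primal solution.
           MOI.eval_constraint_jacobian(evaluator, V, value.(all_variables(model)))
           Jac = SparseArrays.sparse(I, J, V, length(rows), num_variables(model))
           # Columns follow all_variables(model); map each requested
           # variable to its column before slicing.
           col = Dict(v => i for (i, v) in enumerate(all_variables(model)))
           return Jac[:, [col[v] for v in variables]]
       end

With your model, something like jacobian_subset(m, flows_constr, [v; θ]) should then return the same block that extract_jacobian selected.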

Thanks a lot @odow. Following your answers, I have two more questions; I hope you are fine with that :grinning::

  1. Does the function build_nlp_evaluator(model) consider all the constraints?
rows = Any[]
nlp = MOI.Nonlinear.Model()
for (F, S) in list_of_constraint_types(model)
    if !(F <: VariableRef) 
        for ci in all_constraints(model, F, S)
            push!(rows, ci)
            object = constraint_object(ci)
            MOI.Nonlinear.add_constraint(nlp, object.func, object.set)
        end
    end
end

It seems rows only includes the constraints defined by @constraint, while my model includes both @constraint and @NLconstraint; I did not see the @NLconstraint part in rows. So I think the evaluator runs without considering the @NLconstraint part, right? How can I get the Jacobian considering all the constraints?

  2. If the first question can be solved, how can I get a subset of the Jacobian, say jac[rows, cols], where the rows correspond to part of my @NLconstraint constraints and the cols to some of my variables? How do I find their correspondence?

Thank you very much in advance!

The @NL macros are part of the legacy interface. You should change them to use @constraint instead.
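
For example, an illustrative snippet (assuming a model with a variable vector x): a legacy constraint such as

julia> @NLconstraint(model, sin(x[1]) + x[2]^2 <= 1)

becomes

julia> @constraint(model, sin(x[1]) + x[2]^2 <= 1)

Constraints added with @constraint show up in list_of_constraint_types(model), so build_nlp_evaluator will include them in rows.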

For information on the new nonlinear interface, see the Nonlinear Modeling section of the JuMP documentation.

Thank you very much @odow, this solves my issues.

Have a good day!
