Failing to re-add constraints as a new upper bound on an expression in a JuMP model

I have a JuMP model called EP. I am trying to delete and re-add constraints called cMaxCap that impose an upper bound on an expression eTotalCap. The upper bound values come from max_cap_mw. However, after deleting and re-adding, the new cMaxCap constraints do not pick up the new values of max_cap_mw. The details are below.

  1. At each iteration, delete the old constraints cMaxCap and check that they have been successfully deleted.

if haskey(EP.ext, :cMaxCap)
    println("Deleting old constraints")
    for (y, cref) in EP.ext[:cMaxCap]
        delete(EP, cref)
    end
    empty!(EP.ext[:cMaxCap])
else
    println("there is no cMaxCap")
end

This step is successful.
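As a side note, deletion can be double-checked with is_valid. A minimal standalone sketch in plain JuMP (not the actual EP model):

using JuMP
model = Model()
@variable(model, x)
c = @constraint(model, x <= -0.001)
delete(model, c)
println(is_valid(model, c))  # false: the constraint is no longer part of the model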

  2. Update the input data gen[y].max_cap_mw and check that the inputs are updated correctly.

Update gen[y].max_cap_mw externally (assumed done)

This step is successful.
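Purely for illustration (the real update happens elsewhere in my code), assuming gen is a vector of mutable structs with a max_cap_mw field:

# Hypothetical stand-in for the real generator data structure:
mutable struct Generator
    max_cap_mw::Float64
end

gen = [Generator(-0.001) for _ in 1:10]
gen[1].max_cap_mw = 0.1158   # overwrite the old placeholder value
println(gen[1].max_cap_mw)   # prints 0.1158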

  3. Add the new cMaxCap constraints and check them. However, the new values are not fed into the new constraints at this step, as the screenshot shows.

EP.ext[:cMaxCap] = Dict{Any, ConstraintRef}()

for y in MAX_CAP
    println("gen[$y].max_cap_mw = ", gen[y].max_cap_mw)  ## results (in the screenshot below) show the new values
    cap = gen[y].max_cap_mw
    EP.ext[:cMaxCap][y] = @constraint(EP, EP[:eTotalCap][y] <= cap)  ## supposed to be the new value on the RHS
    println("Constraint for $y: ", EP.ext[:cMaxCap][y])  ## Bug here: results show the old values on the RHS
end

I have confirmed that the old constraints were successfully deleted. As the screenshot shows, gen[y].max_cap_mw holds the new values (0.1158 etc.), but the constraints still show <= -0.001, which is the old initial value of gen[y].max_cap_mw. Why does the RHS of the constraints remain -0.001 instead of 0.1158?

I have tried many other forms of deleting and adding constraints, for example @constraint(EP, cMaxCap[y in MAX_CAP], EP[:eTotalCap][y] <= gen[y].max_cap_mw), but the same bug remains, so I don't think it is a syntax issue.
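For reference, a standalone version of that named-container pattern in plain JuMP (not my EP model) would look like the sketch below; note that unregister is required before a container with the same name can be re-added:

using JuMP
model = Model()
@variable(model, x[1:3])
cap = [-0.001, -0.001, -0.001]
@constraint(model, cMaxCap[i in 1:3], x[i] <= cap[i])
# Delete the old constraints and unregister the name before re-adding:
for c in cMaxCap
    delete(model, c)
end
unregister(model, :cMaxCap)
cap = [0.1158, 0.2, 0.3]
@constraint(model, cMaxCap[i in 1:3], x[i] <= cap[i])
print(model)  # the new bounds appear on the right-hand side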

The LHS of the constraints, EP[:eTotalCap][y], is a complex expression; it is symbolic, not numeric. The LHS is a 10-element Vector{AffExpr}:
0.25 _[40657]
0.25 _[40658]
0.25 _[40659]
_[40660] + 0.1168458845
_[40661] + 0.072127429 …

I would really appreciate it if anyone could provide some advice on this problem.

Hi @Wendy_Weng, welcome to the forum :smile:

It’s a bit hard to know what’s going on here. It probably depends a lot on how you have structured your code.

Can you provide a reproducible example that I can run to reproduce your issue?

As an alternative suggestion, rather than deleting and re-adding constraints, you may find it easier to manage the cap as a fixed variable. For example, do something like:

using JuMP
model = Model()
@variable(model, x[1:10, 1:3] >= 0)
@expression(model, eTotalCap[i in 1:10], sum(x[i, :]))
@variable(model, cap[1:10])
@constraint(model, cMaxCap[i in 1:10], eTotalCap[i] <= cap[i])
for i in 1:10
    fix(cap[i], gen[i].max_cap_mw)
end
optimize!(model)
# change gen[i].max_cap_mw
# Update the cap
for i in 1:10
    fix(cap[i], gen[i].max_cap_mw)
end
# Re-optimize
optimize!(model)

Now your constraints don’t change, and you can JuMP.fix the cap to the appropriate value in each iteration.
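A small sketch of the fix mechanics (fix_value reads back the currently fixed value; re-fixing simply overwrites it, and no constraint is deleted):

using JuMP
model = Model()
@variable(model, cap)
fix(cap, -0.001)
println(fix_value(cap))  # -0.001
fix(cap, 0.1158)         # updating the bound is just another call to fix
println(fix_value(cap))  # 0.1158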

See this part of the JuMP documentation: Constraints · JuMP