result_dual vs getdual: signs are opposite

JuMP v0.19:

using JuMP, Clp
m = Model(with_optimizer(Clp.Optimizer))
@variable(m, y >= 0)
@objective(m, Max, y)
@constraint(m, con, y <= 3)
JuMP.optimize!(m)
pi_val = JuMP.result_dual(con)
@show pi_val

I obtained pi_val = -1.0

JuMP v0.18:

using JuMP, Clp
m = Model(solver=ClpSolver())
@variable(m, y >= 0)
@objective(m, Max, y)
@constraint(m, con, y <= 3)
solve(m)
pi_val = getdual(con)
@show pi_val

I obtained pi_val = 1.0

Why does this happen? It is independent of the solver, so I guess something changed in JuMP or MOI.

This issue doesn’t seem to happen when minimizing. I guess the dual value should be multiplied by -1 when maximizing?

This has to do with the duality convention in MOI: http://www.juliaopt.org/MathOptInterface.jl/stable/apimanual.html#Duals-1

From the manual:

An important note for the LP case is that the signs of the feasible duals depend only on the sense of the inequality and not on the objective sense.
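To see the rule in action, here is a minimal sketch (same JuMP v0.19 setup as above, but minimizing -y, which has the same optimal point): the dual of the <= constraint is -1.0 in both the Max and the Min version, because only the inequality sense fixes the sign.

using JuMP, Clp
m = Model(with_optimizer(Clp.Optimizer))
@variable(m, y >= 0)
@objective(m, Min, -y)          # same optimal point as Max y
@constraint(m, con, y <= 3)
JuMP.optimize!(m)
pi_val = JuMP.result_dual(con)  # -1.0 again: the <= sense fixes the sign
@show pi_val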


This was announced as a breaking change in “MathOptInterface and upcoming breaking changes in JuMP 0.19” but didn’t make it into the release notes. I’ll add it.


Thanks @miles.lubin and @joaquimg. I missed the notes.

I think this will make teaching LP courses a little more challenging, as the sign of the dual is the opposite of most LP textbooks’ convention (both undergraduate and graduate). And it seemingly weakens the meaning of the shadow price. Not sure what can be done, though. A training-wheels package? Maybe just some additional explanation in the classroom.

I see your point; the problem is that solvers are not consistent about the signs of duals.
We decided to use the conic convention, which makes things harder for people learning LP.
I agree that losing the shadow-price interpretation is bad.
Maybe we can have an lp_shadow_price function which returns the sign-adjusted duals for LPs and errors on conic problems. Just brainstorming…
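To make the brainstorm concrete, here is a hedged sketch of such a helper; lp_shadow_price is hypothetical (not part of JuMP), and it assumes the v0.19 pre-release names result_dual and objective_sense:

using JuMP, MathOptInterface
const MOI = MathOptInterface

# Hypothetical helper: recover the textbook shadow price of an LP
# constraint from the conic-convention dual by flipping the sign when
# the objective sense is Max. A real version would also check that the
# constraint is affine and error on conic constraints.
function lp_shadow_price(con::ConstraintRef)
    d = JuMP.result_dual(con)
    sense = JuMP.objective_sense(con.model)
    return sense == MOI.MaxSense ? -d : d
end

On the Max example at the top of the thread, this would return 1.0, matching JuMP v0.18’s getdual.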


Yes, the change was for consistency with conic duality. Personally, I find the new rule pretty simple to remember, since the valid sign of the dual depends only on the sense of the constraint. I might be wrong, but I think CVX does the same.

Sounds reasonable. That could go with a toolkit for teaching LP that also includes sensitivity summaries (Feature Request: sensitivity summary for LPs · Issue #1332 · jump-dev/JuMP.jl · GitHub).


I can’t check this myself since I’m traveling and my connection is too shaky, but do you know what duality sign conventions are used in AMPL, GAMS, and the various Python packages? I suspect many JuMP users will have a lot of experience with these environments, and if JuMP behaves differently, this is bound to cause headaches down the road.

If the current definition stays, then +1 for providing lp_shadow_price.

+1 for an LP teaching toolkit, or an LP toolbox in general: dual + sensitivity + termination status, etc.

Could result_dual be split into result_conic_dual and result_lp_dual? Would that be too much?

We’ll provide JuMP.shadow_price. It’s a well-defined concept for LPs, i.e., sensitivity of the objective to an infinitesimal relaxation of the constraint.

You can see where it breaks down with homogeneous (conic) constraints like x ∈ C for some cone C, because it’s ambiguous what it means to relax the constraint. Conic duals are instead based on supporting hyperplanes to C.
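On the toy maximization model at the top of the thread, the intended behavior would be as follows (a sketch, assuming the announced shadow_price API):

pi_dual = JuMP.result_dual(con)    # -1.0, conic convention
pi_shadow = JuMP.shadow_price(con) # 1.0: relaxing y <= 3 to y <= 3 + eps
                                   # improves the Max objective by eps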

No, I don’t. I’d be interested to see a survey of how duals are handled across modeling interfaces. This sign change is a result of headaches that already occurred when trying to support conic optimization models (e.g., SOCP, SDP). AFAIK these interfaces have no first-class support for SDP, so they have less complexity to deal with.

Follow-up PR implementing shadow_price: Implement shadow_price by mlubin · Pull Request #1568 · jump-dev/JuMP.jl · GitHub.

JuMP indeed has annoying shadow-price behavior. This issue, first raised by @chkwon, also matters in the presence of nonlinear models/constraints in JuMP.

The behavior is not there in GAMS. My model is an economic growth model. It took me a while to figure out what was going on, which was kind of annoying, as I could have saved time had I worked in GAMS right away. But I love JuMP/Julia and the flexibility it offers.

I had to re-program everything in GAMS and also derive the complementarity problem to be sure I was not making errors.

Moreover, JuMP.shadow_price does not work with @NLconstraint. Nonlinear constraints are the bread and butter of economic modelling, so shadow_price won’t work in this case.

I do not want to come off as ungrateful, since I could not do the great work that is being done and has been done with JuMP. But it would be great to address this issue once and for all, as it is likely to affect new users.

I see three ways out: 1) make shadow_price work with nonlinear constraints; 2) have JuMP be consistent with GAMS/AMPL, perhaps by first checking whether the problem is a Max or a Min and then multiplying the duals/shadow prices by -1 for Max; 3) make it more explicit in the manual what JuMP.dual actually returns, so that people like me can back out the proper shadow prices (a sketch of this sign flip is below).
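As an illustration of options 2 and 3, here is a hedged sketch of backing out a textbook-sign multiplier by hand. The model is made up; it assumes JuMP.dual is available for nonlinear constraint references and that the reported multiplier follows the same sign convention as the linear case, which should be verified against your solver.

using JuMP, Ipopt
m = Model(with_optimizer(Ipopt.Optimizer))
@variable(m, x >= 0.1)
@NLobjective(m, Max, log(x))
@NLconstraint(m, con, x^2 <= 4)
JuMP.optimize!(m)
mu = JuMP.dual(con)  # multiplier in JuMP's convention
shadow = -mu         # flip the sign for Max to recover the textbook sign
@show shadow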

Thanks

@jovansam has created a new topic with basically the same text as this post. Please let us keep this branch of the discussion in a single place and reply to this post there.
