result_dual vs. getdual: the signs are opposite


#1

JuMP v0.19:

using JuMP, Clp
m = Model(with_optimizer(Clp.Optimizer))
@variable(m, y>=0)
@objective(m, Max, y)
@constraint(m, con, y <= 3)
JuMP.optimize!(m)
pi_val = JuMP.result_dual(con)
@show pi_val

I obtained pi_val = -1.0

JuMP v0.18:

using JuMP, Clp
m = Model(solver=ClpSolver())
@variable(m, y>=0)
@objective(m, Max, y)
@constraint(m, con, y <= 3)
solve(m)
pi_val = getdual(con)
@show pi_val

I obtained pi_val = 1.0

Why does this happen? It is independent of the solver, so I guess something changed in JuMP or MOI.


#2

This issue doesn't seem to happen when minimizing. I guess the dual value should be multiplied by -1 when maximizing?
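
A quick way to check the minimization case (a sketch using the same Clp setup as in the first post; the expected value follows from the convention discussed below):

using JuMP, Clp
m = Model(with_optimizer(Clp.Optimizer))
@variable(m, y>=0)
@objective(m, Min, -y)           # same feasible region, minimization sense
@constraint(m, con, y <= 3)
JuMP.optimize!(m)
@show JuMP.result_dual(con)      # expected: -1.0, same sign as getdual in v0.18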


#3

This has to do with the duality convention in MOI: http://www.juliaopt.org/MathOptInterface.jl/stable/apimanual.html#Duals-1

From the manual:

An important note for the LP case is that the signs of the feasible duals depend only on the sense of the inequality and not on the objective sense
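
Concretely, for the example in the first post: con is a <= constraint, so under the new convention its dual is nonpositive whether the objective is Max y or Min -y; only the constraint sense matters. If you want the v0.18-style (textbook) sign for the maximization case, you can negate the value yourself (a sketch, not an official API):

pi_val = JuMP.result_dual(con)   # -1.0 under the MOI convention (con is y <= 3)
textbook_pi = -pi_val            # 1.0, the value getdual returned in JuMP v0.18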


#4

This was announced as a breaking change in "MathOptInterface and upcoming breaking changes in JuMP 0.19" but it didn't make it into the release notes. I'll add it.


#5

Thanks @miles.lubin and @joaquimg. I missed the notes.

I think this will make teaching LP courses a little more challenging, as the sign of the dual is the opposite of the convention in most LP textbooks (both undergraduate and graduate). It also seems to weaken the shadow-price interpretation. I'm not sure what can be done, though. A training-wheels package? Maybe just some additional explanation in the classroom.


#6

I see your point; the problem is that solvers are not consistent about the signs of duals.
We decided to use the conic convention, which makes things harder for people learning LP.
I agree that losing the shadow-price interpretation is bad.
Maybe we could have an lp_shadow_price function that returns the sign-adjusted duals for LPs and errors on conic problems; just brainstorming…
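
For example, something along these lines (purely a sketch of the idea; lp_shadow_price is a hypothetical name, not part of JuMP or MOI):

# Hypothetical helper: given the MOI-convention dual of a scalar <= constraint
# and the objective sense, return the textbook shadow price, i.e. the change in
# the optimal objective per unit increase of the right-hand side.
function lp_shadow_price(dual_value::Float64, maximizing::Bool)
    # MOI convention: the dual of a <= constraint is nonpositive for Min and Max.
    # When maximizing, relaxing the constraint increases the objective, so the
    # shadow price is the negated dual; when minimizing it is the dual itself.
    return maximizing ? -dual_value : dual_value
end

lp_shadow_price(JuMP.result_dual(con), true)   # 1.0 for the Max example above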


#7

Yes, the change was for consistency with conic duality. Personally, I find the new rule pretty simple to remember, since the feasible sign of the dual depends only on the sense of the constraint. I might be wrong, but I think CVX does the same.

Sounds reasonable. That could go with a toolkit for teaching LP that also includes sensitivity summaries (https://github.com/JuliaOpt/JuMP.jl/issues/1332).


#8

I can't check this myself since I'm traveling and my connection is too shaky, but do you know what dual sign conventions are used in AMPL, GAMS, and the various Python packages? I suspect many JuMP users will have a lot of experience with these environments, and if JuMP behaves differently then this is bound to cause headaches down the road.

If the current definition stays then +1 for providing lp_shadow_price.


#9

+1 for an LP teaching toolkit, or an LP toolbox in general: duals + sensitivity + termination status, etc.

Could result_dual be split into result_conic_dual and result_lp_dual? Or would that be too much?


#10

We’ll provide JuMP.shadow_price. It’s a well-defined concept for LPs, i.e., sensitivity of the objective to an infinitesimal relaxation of the constraint.

You can see where this breaks down for homogeneous (conic) constraints like x ∈ C for some cone C, because it's more ambiguous what it means to relax such a constraint. Conic duals are instead based on supporting hyperplanes to C.
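
On the original example this would look roughly like the following (a sketch; the exact behavior is whatever the implementing PR settles on):

JuMP.result_dual(con)    # -1.0, MOI sign convention
JuMP.shadow_price(con)   # expected: 1.0, the objective increases by 1 per unit relaxation of y <= 3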

No, I don't. I'd be interested to see a survey of how duals are handled across modeling interfaces. This sign change is a result of headaches that already occurred when trying to support conic optimization models (e.g., SOCP, SDP). AFAIK these interfaces have no first-class support for SDP, so they have less complexity to deal with.


#11

Follow-up PR implementing shadow_price: https://github.com/JuliaOpt/JuMP.jl/pull/1568.