I think this will make teaching LP courses a little more challenging, as the sign of the dual is the opposite of the convention in most LP textbooks (both undergraduate and graduate). It also seems to weaken the meaning of the shadow price. Not sure what can be done though. A training-wheels package? Maybe just some additional explanation in the classroom.
I see your point; the problem is that solvers are not consistent about what to do with the signs of duals.
We decided to use the conic convention, which makes things harder for people learning LP.
I agree that losing the shadow price interpretation is bad.
Maybe we could have an lp_shadow_price function which returns the sign-adjusted duals for LPs and errors on conic problems. Just brainstorming…
Yes, the change was made for consistency with conic duality. Personally, I find the new rule pretty simple to remember, since the valid sign of the dual depends only on the sense of the constraint (≤ or ≥), not on whether you minimize or maximize. I might be wrong, but I think CVX does the same.
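Here's a minimal sketch of what I mean, assuming HiGHS as the solver (any LP solver would do): the same ≤-constraint gets a nonpositive dual whether you minimize or maximize.

```julia
using JuMP, HiGHS

# Same feasible region both times; only the objective sense differs.
function capacity_dual(maximize::Bool)
    model = Model(HiGHS.Optimizer)
    set_silent(model)
    @variable(model, 0 <= x <= 10)
    @constraint(model, c, x <= 4)
    if maximize
        @objective(model, Max, x)   # optimum at x = 4, c is binding
    else
        @objective(model, Min, -x)  # same optimum, x = 4
    end
    optimize!(model)
    return dual(c)
end

capacity_dual(true)   # -1.0
capacity_dual(false)  # -1.0: nonpositive either way; the ≤ sense fixes the sign
```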
I can’t check this myself since I’m traveling and my connection is too shaky, but do you know what dual sign conventions are used in AMPL, GAMS, and the various Python packages? I suspect many JuMP users will have a lot of experience with these environments, and if JuMP behaves differently, this is bound to cause headaches down the road.
If the current definition stays then +1 for providing lp_shadow_price.
We’ll provide JuMP.shadow_price. It’s a well-defined concept for LPs, i.e., sensitivity of the objective to an infinitesimal relaxation of the constraint.
You can see where it breaks down with homogeneous (conic) constraints like x ∈ C for some cone C, because it’s ambiguous what it means to relax such a constraint. Conic duals are instead based on supporting hyperplanes to C.
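For a concrete LP, here’s a minimal sketch, assuming HiGHS as the solver, of how the two queries differ only in sign:

```julia
using JuMP, HiGHS

model = Model(HiGHS.Optimizer)
set_silent(model)
@variable(model, x >= 0)
@variable(model, y >= 0)
@constraint(model, capacity, x + y <= 4)
@objective(model, Max, 3x + 2y)
optimize!(model)

dual(capacity)          # -3.0: nonpositive, per the conic convention
shadow_price(capacity)  #  3.0: objective gain per unit of extra capacity
```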
No, I don’t. I’d be interested to see a survey of how duals are handled across modeling interfaces. The sign change is a result of headaches that already occurred when trying to support conic optimization models (e.g., SOCP, SDP). AFAIK those interfaces have no first-class support for SDP, so they have less complexity to deal with.
JuMP indeed has annoying shadow price behavior. This issue, first raised by @chkwon, also matters in the presence of nonlinear models/constraints in JuMP.
The behavior is not there in GAMS. My model is an economic growth model. It took me a while to figure out what was going on, which was kind of annoying, since I could have saved time had I worked in GAMS from the start. But I love JuMP/Julia and the flexibility it offers.
I had to re-program everything in GAMS and also derive the complementarity problem to be sure I was not making errors.
Moreover, JuMP.shadow_price does not work with @NLconstraint. Nonlinear constraints are the bread and butter of economic modelling, so shadow_price won’t work in this case.
I do not want to come off as ungrateful, as I could not do the great work that has been and is being done on JuMP. But it would be great to address this issue once and for all; it is likely to affect new users.
I see three ways out:
1) Make shadow_price work with nonlinear constraints.
2) Have JuMP be consistent with GAMS/AMPL, maybe by first checking whether the problem is a Max or a Min and then multiplying the duals/shadow prices by -1 for Max (see the sketch below).
3) Make it more explicit in the manual what JuMP.dual actually returns, so that people like me can back out the proper shadow prices.
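For 2), I imagine something like the rough sketch below. To be clear, gams_style_dual is a made-up name, not a JuMP API; I am assuming that GAMS/AMPL report marginals as d(objective)/d(rhs), and whether dual works on an @NLconstraint reference depends on the JuMP version.

```julia
using JuMP

# Hypothetical helper (not part of JuMP): recover a GAMS/AMPL-style
# marginal from JuMP's conic dual by negating it for Max problems.
# Assumes the solver has returned duals, i.e. has_duals(model) == true.
function gams_style_dual(con::ConstraintRef)
    model = owner_model(con)
    d = dual(con)
    # Assumption: GAMS/AMPL report the marginal as d(objective)/d(rhs),
    # which matches JuMP's dual under Min and its negation under Max.
    return objective_sense(model) == MOI.MAX_SENSE ? -d : d
end
```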