Dual vs shadow price: Which one to use?

I want to make sure I correctly understand the difference between the values returned by JuMP's dual and shadow_price functions. Consider the following primal problem:

using JuMP, Gurobi
begin
    model = Model(Gurobi.Optimizer)
    @variable(model, x[1:3] >= 0)
    @objective(model, Max, 5x[1] + 12x[2] + 4x[3])
    @constraint(model, c1, x[1]+2x[2]+x[3] <= 10)
    @constraint(model, c2, 2x[1] - x[2] + 3x[3] == 8)
    optimize!(model)
end

println("Dual c1: ", dual(c1)) # returns -5.8
println("Dual c2: ", dual(c2)) # returns 0.4
println("Shadow price c1: ", shadow_price(c1)) # returns 5.8
println("Shadow price c2: ", shadow_price(c2)) # returns 0.4

Now, let’s solve the dual problem using JuMP:

begin
    model2 = Model(Gurobi.Optimizer)
    @variable(model2, y1 >= 0)
    @variable(model2, y2)
    @objective(model2, Min, 10y1+8y2)
    @constraint(model2, d1, y1+2y2 >= 5)
    @constraint(model2, d2, 2y1-y2 >= 12)
    @constraint(model2, d3, y1+3y2 >= 4)
    optimize!(model2)
end

println("y1: ", value(y1)) # returns 5.8
println("y2: ", value(y2)) # returns -0.4

Now, y1 and y2 correspond to the dual variables of the primal problem, and I was expecting their values to equal the values obtained via either the dual or shadow_price function; however, y1 = shadow_price(c1) and y2 = -dual(c2) = -shadow_price(c2).

I am solving a problem using column generation, and I want to extract the dual variables corresponding to the constraints in the master problem (MP). For my problem, all my constraints are equality constraints (convexity constraints and complicating constraints). So, I just want to solve the primal version of the MP and extract the correct dual variable values using JuMP directly. Which function should I be using (dual/shadow_price) and is there anything else I need to be careful about while extracting the dual values?

Thanks!

dual seems to return the solution of this equivalent dual formulation:

begin
    model2 = Model(Gurobi.Optimizer)
    @variable(model2, λ1 <= 0)
    @variable(model2, λ2)
    @objective(model2, Min, -10λ1 - 8λ2)
    @constraint(model2, d1, -λ1 - 2λ2 >= 5)
    @constraint(model2, d2, -2λ1 + λ2 >= 12)
    @constraint(model2, d3, -λ1 - 3λ2 >= 4)
    optimize!(model2)
end

println("λ1: ", value(λ1)) # returns -5.8
println("λ2: ", value(λ2)) # returns 0.4

with λ1 = -y1 and λ2 = -y2.
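Substituting λ1 = -y1 and λ2 = -y2 makes the equivalence explicit. Written side by side (plain LP duality for this specific instance; each line on the right is the corresponding line on the left after the substitution):

```latex
% Classical dual (y) vs. the sign-flipped dual (\lambda = -y):
\begin{aligned}
\min_{y}\;    & 10y_1 + 8y_2      & \qquad \min_{\lambda}\; & -10\lambda_1 - 8\lambda_2 \\
\text{s.t.}\; & y_1 + 2y_2 \ge 5  & \text{s.t.}\;           & -\lambda_1 - 2\lambda_2 \ge 5 \\
              & 2y_1 - y_2 \ge 12 &                         & -2\lambda_1 + \lambda_2 \ge 12 \\
              & y_1 + 3y_2 \ge 4  &                         & -\lambda_1 - 3\lambda_2 \ge 4 \\
              & y_1 \ge 0         &                         & \lambda_1 \le 0
\end{aligned}
```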

According to the documentation, shadow_price is the same as dual except that the sign is flipped depending on the constraint type and the objective sense (Min/Max).
You can see how it’s computed from dual in the source code.

See this discussion in the documentation: Constraints · JuMP

Thanks for the discussion link. So, if I understand correctly, based on some trial-and-error code and @BatyLeo's reply:

  1. If I am trying to extract dual variable values for a maximization problem, dual would actually return the negative of the dual values obtained by the actual dual problem.
  2. For a minimization problem, dual would return the same values as obtained by the actual dual problem.

Is that right? Or am I still missing something? Also, since my master problem is a minimization problem, I should be fine using the dual function to extract the dual variable values?

  1. If I am trying to extract dual variable values for a maximization problem, dual would actually return the negative of the dual values obtained by the actual dual problem.

Not quite. The answer is that it depends on what you mean by “dual.”

There are two common conventions, which in the maximization case return the negative of each other.

If you want a conic dual, which means that regardless of the objective sense, the dual of a greater-than constraint is non-negative and the dual of a less-than constraint is non-positive, then use dual.

If you expect the sign of the dual to depend on the objective sense, then use shadow_price.

If your problem is a minimization, then dual and shadow_price are equivalent.

The distinction is just a convention. There is no right or wrong answer, and you’ll find different textbooks using both approaches.

Got it. Thanks!

Oh, I am not sure I get this: the following minimization problem gives values of opposite sign from dual and shadow_price:

begin
    model2 = Model(Gurobi.Optimizer)
    @variable(model2, y1 >= 0)
    @variable(model2, y2)
    @objective(model2, Min, 10y1+8y2)
    @constraint(model2, d1, y1+2y2 >= 5)
    @constraint(model2, d2, 2y1-y2 >= 12)
    @constraint(model2, d3, y1+3y2 >= 4)
    optimize!(model2)
end

Ha. That’ll teach me for assuming. Even I find it hard to keep the conventions the right way around.

When it’s a less-than constraint, we negate the dual when the objective sense is maximization.

When it’s a greater-than constraint, we negate when the sense is minimization.
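The two rules can be summarized in a small helper. This is only a sketch of the convention described above, not JuMP's actual source; the function name and the Symbol arguments are made up for illustration:

```julia
# Sketch of the sign convention relating dual and shadow_price.
# `d` is the conic dual value, `set` is :less_than or :greater_than,
# and `sense` is the objective sense, :Min or :Max.
function shadow_price_from_dual(d::Real, set::Symbol, sense::Symbol)
    if set === :less_than
        # less-than: negate when maximizing
        return sense === :Max ? -d : d
    elseif set === :greater_than
        # greater-than: negate when minimizing
        return sense === :Min ? -d : d
    else
        error("convention shown only for inequality constraints")
    end
end

# The Max example above: c1 is less-than with dual -5.8
shadow_price_from_dual(-5.8, :less_than, :Max)  # returns 5.8

# The Min example above: a greater-than dual flips sign
shadow_price_from_dual(0.4, :greater_than, :Min)  # returns -0.4
```

This reproduces both observations in the thread: in the Max problem, shadow_price(c1) = -dual(c1) = 5.8, while in the Min problem with greater-than constraints, shadow_price is the negative of dual.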

Ah, this explains the behavior. Thanks again!
