In constraint optimization, the shadow price is the change, per infinitesimal unit of the constraint, in the optimal value of the objective function, obtained by relaxing the constraint.
This is the Wikipedia definition of the shadow price, which is also adopted by JuMP.
We can infer from it that the shadow price is a primal-side quantity: its definition makes no reference to the dual side.
Let’s look at the following LP example, in which `3 * y >= x`, namely `μ`, is the constraint that we aim to study:
primal = JuMP.Model(Gurobi.Optimizer)
JuMP.@variable(primal, 0 <= y); JuMP.@variable(primal, 0 <= x)
JuMP.@objective(primal, Min, 2 * y + 0.7 * x)
JuMP.@constraint(primal, μ, 3 * y >= x)
# For ease of reference, the dual program of `primal` is the following:
# Max  0
# s.t. -3μ >= -2   [⇐ y ≥ 0]
#       μ >= -0.7  [⇐ x ≥ 0]
#       μ >= 0
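To make these comment lines concrete, here is a minimal runnable sketch of that dual program (the model name `dual_lp` is ours). Note that the dual objective is constantly 0, so every dual-feasible point, i.e. every μ in [0, 2/3], is dual-optimal:
import JuMP, Gurobi
dual_lp = JuMP.Model(Gurobi.Optimizer)
JuMP.@variable(dual_lp, 0 <= μ)          # μ here denotes the dual variable (rebinding the name μ above)
JuMP.@objective(dual_lp, Max, 0 * μ)     # Max 0
JuMP.@constraint(dual_lp, -3 * μ >= -2)  # [⇐ y ≥ 0]
JuMP.@constraint(dual_lp, μ >= -0.7)     # [⇐ x ≥ 0]
JuMP.optimize!(dual_lp)
JuMP.objective_value(dual_lp)            # 0, attained by every μ in [0, 2/3]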
If we want to derive the shadow price by definition, we should add a parameter `p` and update the constraint to `3 * y + p >= x`. Here `p` should be a small positive number, where “small” comes from the defining word “infinitesimal”, and “positive” from the defining word “relaxing”.
Considering the updated LP, we conclude that
- The optimal objective value can never be negative, since both objective coefficients and both variables are nonnegative.
- When `x = y = 0`, the primal problem is feasible with the objective attaining 0.
Therefore, the optimal objective value (before or after the perturbation by `p`) is always 0. Hence, the difference is 0, and after being divided by `p` it is still 0. We conclude that the shadow price of the constraint `μ` is 0.
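This derivation can also be checked numerically. Below is a minimal sketch that follows the definition (the helper name `perturbed_optimum` and the choice `p = 1e-4` are ours; `p` stands in for the infinitesimal):
import JuMP, Gurobi
function perturbed_optimum(p) # optimal value of the LP whose μ constraint is relaxed by p
    m = JuMP.Model(Gurobi.Optimizer)
    JuMP.set_silent(m)
    JuMP.@variable(m, 0 <= y); JuMP.@variable(m, 0 <= x)
    JuMP.@objective(m, Min, 2 * y + 0.7 * x)
    JuMP.@constraint(m, 3 * y + p >= x)
    JuMP.optimize!(m)
    return JuMP.objective_value(m)
end
p = 1e-4 # a small positive number
(perturbed_optimum(p) - perturbed_optimum(0.0)) / p # 0.0, the shadow price by definition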
Nevertheless, the `shadow_price` function in JuMP is derived from `dual`, a totally distinct concept. Moreover, since the `JuMP.dual` function depends on the solver, it is subject to numerical precision and therefore cannot return a precise “rate of change”.
The following runnable code shows that JuMP may provide a false result (🍅), where “false” means inconsistent with the aforementioned definition.
import JuMP, Gurobi
model = JuMP.Model(Gurobi.Optimizer)
JuMP.@variable(model, 0 <= y); JuMP.@variable(model, 0 <= x)
JuMP.@objective(model, Min, 2 * y + 0.7 * x)
JuMP.@constraint(model, μ, 3 * y >= x)
JuMP.optimize!(model); JuMP.assert_is_solved_and_feasible(model; allow_local = false, dual = true)
# for this Min-sense `>=` constraint, JuMP's shadow_price is the negative of the dual value
@assert JuMP.shadow_price(μ) == -1 * JuMP.dual(μ)
JuMP.shadow_price(μ) # -0.6666666666666666 🍅
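The specific value -2/3 can be traced back to the dual program written above (this reading is ours, not part of the definition): since the dual objective is constantly 0, every μ in [0, 2/3] is dual-optimal; Gurobi returns the vertex μ = 2/3, and `shadow_price` merely negates it, even though μ = 0, which would reproduce the definitional value, is equally dual-optimal:
JuMP.dual(μ) # 0.6666666666666666, the vertex 2/3 of the dual-optimal interval [0, 2/3]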