JuMP.shadow_price conveys a misconception?

In constraint optimization, the shadow price is the change, per infinitesimal unit of the constraint, in the optimal value of the objective function, obtained by relaxing the constraint.

This is the Wikipedia definition of the shadow price, which JuMP also adopts.
From it we can infer that the shadow price is a primal-side quantity, unrelated to the dual side.

Consider the following LP, in which the constraint 3 * y >= x (named μ) is the one we aim to study:

import JuMP, Gurobi
primal = JuMP.Model(Gurobi.Optimizer)
JuMP.@variable(primal, 0 <= y); JuMP.@variable(primal, 0 <= x)
JuMP.@objective(primal, Min, 2 * y + 0.7 * x)
JuMP.@constraint(primal, μ, 3 * y >= x)
# For ease of reference
# The dual program of `primal` is the following 4 lines
# Max 0
#     -3 μ >= -2 [⇐ y ≥ 0]
#     μ >= -0.7  [⇐ x ≥ 0]
#     μ >= 0

If we want to derive the shadow price by definition, we should add a parameter p and update the constraint to 3 * y + p >= x. Here p should be a small positive number, where “small” is due to the defining word “infinitesimal”, and “positive” is due to the defining word “relaxing”.
Considering the updated LP, we conclude that

  1. The optimal objective value can never be negative, since x, y >= 0 and both objective coefficients are positive.
  2. When x = y = 0, the problem is feasible (3 * 0 + p >= 0 holds for any p >= 0) and the objective attains 0.

Therefore, the optimal objective value, both before and after the perturbation by p, is 0.
Hence the difference is 0, and so is the difference quotient after dividing by p.

We conclude that the shadow price of the μ constraint is 0.
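This definition-based computation can be checked numerically. A minimal sketch, assuming the open-source HiGHS solver in place of Gurobi so it runs without a license (the function name `optimal_value` is mine, not from the thread):

```julia
import JuMP, HiGHS  # HiGHS stands in for Gurobi here

# Solve the LP with the constraint relaxed to 3y + p >= x and
# return the optimal objective value.
function optimal_value(p)
    model = JuMP.Model(HiGHS.Optimizer)
    JuMP.set_silent(model)
    JuMP.@variable(model, 0 <= y)
    JuMP.@variable(model, 0 <= x)
    JuMP.@objective(model, Min, 2 * y + 0.7 * x)
    JuMP.@constraint(model, 3 * y + p >= x)  # relaxed by p
    JuMP.optimize!(model)
    return JuMP.objective_value(model)
end

p = 1e-3
# Shadow price by the definition: change in optimum per unit of relaxation.
(optimal_value(p) - optimal_value(0.0)) / p  # 0.0
```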

Nevertheless, the shadow_price function in JuMP is computed from the dual, a totally distinct concept. Secondarily, since JuMP.dual depends on the solver, it is subject to numerical precision and therefore cannot return an exact "rate of change".

The following runnable code shows that JuMP may return a result that is false (🍅), in the sense of being inconsistent with the definition above.

import JuMP, Gurobi
model = JuMP.Model(Gurobi.Optimizer)
JuMP.@variable(model, 0 <= y); JuMP.@variable(model, 0 <= x)
JuMP.@objective(model, Min, 2 * y + 0.7 * x)
JuMP.@constraint(model, μ, 3 * y >= x)
JuMP.optimize!(model); JuMP.assert_is_solved_and_feasible(model; allow_local = false, dual = true)
@assert JuMP.shadow_price(μ) == -1 * JuMP.dual(μ)
JuMP.shadow_price(μ) # -0.6666666666666666 🍅

Put simply, the following two statements (🔵 and 🔴) are inconsistent:

help?> JuMP.shadow_price
  shadow_price(con_ref::ConstraintRef)

  🔵Return the change in the objective from an infinitesimal relaxation of the constraint.

  🔴The shadow price is computed from dual and can be queried only when the objective sense is     
  MIN_SENSE or MAX_SENSE (not FEASIBILITY_SENSE).

🔵 adopts the Wikipedia definition, which is purely primal-side.
🔴 depends wholly on the dual side, and thus describes a different concept.

The shadow price is defined exactly in terms of the dual variables (really just flipping the sign in some cases). They don’t represent different concepts.

Your confusion stems from the fact that your example has a redundant constraint: the primal is degenerate, and there are multiple optimal dual solutions.

To see this, try writing the primal as:

using JuMP

primal = Model()
@variable(primal, x)
@variable(primal, y)
@objective(primal, Min, 2 * y + 0.7 * x)
@constraint(primal, c1, 3 * y >= x)
@constraint(primal, c2, x >= 0)
@constraint(primal, c3, y >= 0)
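To make the degeneracy concrete: the optimal value of the LP, as a function of the perturbation p in 3y + p >= x, has a closed form with a kink at p = 0, so its left and right derivatives there disagree. A pure-Julia sketch, derived by hand from the LP above (no solver needed; the function name `V` is mine):

```julia
# Optimal value of: min 2y + 0.7x  s.t.  3y + p >= x, x >= 0, y >= 0.
# For p >= 0, (x, y) = (0, 0) stays feasible and optimal, so V(p) = 0.
# For p < 0, taking x = 0 the constraint forces y >= -p/3, so V(p) = -2p/3.
V(p) = p >= 0 ? 0.0 : -2p / 3

h = 1e-6
right_derivative = (V(h) - V(0.0)) / h    # 0.0:  the definition-based shadow price
left_derivative  = (V(0.0) - V(-h)) / h   # -2/3: the value JuMP reported via the dual
```

Both one-sided derivatives are valid sensitivities; any optimal dual μ in [0, 2/3] yields a negated subgradient of V in [-2/3, 0], so Gurobi's -2/3 and the definition-based 0 can disagree without either being a bug.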

The single-sentence definition of the shadow price is the commonly used definition, and sure, it doesn’t account for some of the edge cases that can happen. This gets back to something we discussed a few days ago: to what extent should the documentation of JuMP also be a textbook on general optimization concepts?

I don’t know that the docstring of shadow_price needs a lot of detail. Shadow price is a commonly used term of art in the field, and people can refer to other sources for the various technical aspects.

I wrote a note on this to facilitate future reference.

About that docstring, I think at the very least we could swap its first and second statements.