Is there a recommended way to perform sensitivity analysis of the optimization target to different constraints?

What I mean by that is a way to determine which constraints are most limiting to the objective function. For example, in this MWE I’d like to be able to ascertain which constraint could be relaxed to achieve the most benefit to the optimization target:

```
using JuMP
using GLPK

function optimizer(limitall, limitone)
    model = Model(GLPK.Optimizer)
    @variable(model, x[1:5])
    @constraint(model, x .<= limitall)
    @constraint(model, x[1] <= limitone)
    vals = collect(1:5)
    @objective(model, Max, sum(x .* vals))
    optimize!(model)
    return model
end

m = optimizer(10, 5)
objective_value(m) # 145
value.(m[:x])      # [5.0, 10.0, 10.0, 10.0, 10.0]
```

In this case, if `limitall` went from `10` to `11`, my objective would improve by `14` (each of `x[2]`–`x[5]` rises by 1, weighted by its objective coefficient: 2 + 3 + 4 + 5), while `limitone` going from `5` to `6` would improve it by `1`.
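These numbers can be checked by brute force, re-solving the MWE with each limit perturbed by one:

```
base = objective_value(optimizer(10, 5))   # 145.0
objective_value(optimizer(11, 5)) - base   # 14.0 (= 2 + 3 + 4 + 5)
objective_value(optimizer(10, 6)) - base   # 1.0
```

But re-solving once per constraint is exactly what I'd like to avoid when there are 100+ of them.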

In practice, I have a model like this that has 100+ constraints and I’m trying to determine which are most binding.

I have seen the function `JuMP.lp_sensitivity_report(m)`, but it seems to answer a different question: how far each constraint's right-hand side can move while the current optimal basis stays optimal, not how much the objective would gain from relaxing it.
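For reference, here is roughly how I called it on the MWE, with the constraints given names so the report can be indexed (`limall` and `limone` are just names I made up for this sketch):

```
using JuMP
using GLPK

model = Model(GLPK.Optimizer)
@variable(model, x[1:5])
@constraint(model, limall[i = 1:5], x[i] <= 10)
@constraint(model, limone, x[1] <= 5)
@objective(model, Max, sum(i * x[i] for i in 1:5))
optimize!(model)

report = lp_sensitivity_report(model)
# report[limone] is a tuple: how much the RHS may decrease/increase
# while the current basis remains optimal
report[limone]
```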

I have also tried wrapping `optimizer` in a function `f` that takes the two limits and returns the objective value. Zygote failed to differentiate it with the error:

```
Compiling Tuple{typeof(MathOptInterface.add_variable), MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{GLPK.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}: try/catch is not supported.
```
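For completeness, the Zygote attempt looked roughly like this (reconstructed; `f` just forwards to the `optimizer` function above):

```
using Zygote

f(limitall, limitone) = objective_value(optimizer(limitall, limitone))

Zygote.gradient(f, 10.0, 5.0)  # throws the try/catch error above
```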