Derivative of variable with respect to constraint?

I have a Linear Programming problem that I am modeling using JuMP. If I understand correctly, JuMP.getdual(cons) evaluated on a constraint gives the rate of change of the objective when the right-hand side (RHS) of the constraint is slightly increased.
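For concreteness, here is a minimal sketch of what I mean, assuming the JuMP 0.18-era API (where `getdual` exists; later versions renamed it `dual`) and the Clp solver. The model itself is just an illustrative toy LP:

```julia
using JuMP, Clp

# Toy LP: maximize 3x + 2y subject to two resource constraints.
m = Model(solver = ClpSolver())
@variable(m, x >= 0)
@variable(m, y >= 0)
@constraint(m, resource, 2x + y <= 10)
@constraint(m, labor, x + 3y <= 15)
@objective(m, Max, 3x + 2y)
solve(m)

# Shadow price: rate of change of the optimal objective per unit
# increase in the constraint's RHS (while the basis is unchanged).
println(getdual(resource))
println(getdual(labor))
```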

I would like to evaluate the derivative of a variable with respect to a constraint (assuming that the optimal solution is unique). Is there a facility in JuMP to do this?
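In case it helps to make the question precise: since the optimal solution of an LP is piecewise linear in the RHS, one workaround is a finite-difference approximation, re-solving with a slightly perturbed RHS. This is not a JuMP facility, just a sketch under the JuMP 0.18-era API with a toy model, and it is only valid while the perturbation does not change the optimal basis:

```julia
using JuMP, Clp

# Solve a toy LP with RHS b and return the optimal value of x.
function solve_lp(b)
    m = Model(solver = ClpSolver())
    @variable(m, x >= 0)
    @variable(m, y >= 0)
    @constraint(m, cons, 2x + y <= b)
    @objective(m, Max, 3x + 2y)
    solve(m)
    return getvalue(x)
end

# Forward difference: dx*/db ≈ (x*(b + ε) - x*(b)) / ε.
# Exact while the optimal basis stays the same, since x*(b)
# is piecewise linear in b.
ε = 1e-6
dx_db = (solve_lp(10 + ε) - solve_lp(10)) / ε
```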

Not that I’m aware of. You’ll have to derive this from basic LP sensitivity analysis.

Take a look at the JuliaOpt notebooks; there's a sensitivity analysis notebook that emulates the Excel Solver sensitivity report. But you will need the Gurobi solver to evaluate the RHS ranges. With solvers like Cbc and Clp you can only get the shadow prices and the reduced costs. If you need the allowable increase or decrease for the RHS and variables, you can get them with Gurobi. I'm trying to get them with the COIN-OR solvers but haven't managed so far.
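For what it's worth, the Gurobi attributes behind those ranges are SARHSLow and SARHSUp (the RHS can move within that interval without changing the optimal basis). One way to reach them from Julia is via the raw solver object, assuming a JuMP 0.18-era Gurobi.jl that exposes the low-level accessor `get_dblattrarray` (check your version's API before relying on this):

```julia
using JuMP, Gurobi

# Toy LP with a single constraint.
m = Model(solver = GurobiSolver())
@variable(m, x >= 0)
@constraint(m, con, 2x <= 10)
@objective(m, Max, 3x)
solve(m)

# Drop down to the underlying Gurobi model and query the
# sensitivity-analysis attributes for constraint 1.
grb = getrawsolver(m)
lo = Gurobi.get_dblattrarray(grb, "SARHSLow", 1, 1)
hi = Gurobi.get_dblattrarray(grb, "SARHSUp", 1, 1)
```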

sIPOPT has support for sensitivity analysis calculations. Although it would be overkill for your problem (it is an NLP solver), it might be helpful now or in the future.

https://projects.coin-or.org/Ipopt/wiki/sIpopt

Where can I find the “juliaopt notebooks”?