Hi @disberd,
As usual, you know how to talk to a computing moron. I did as you mentioned, and now I can solve the optimization problem using JuMP inside Pluto.
Usually, there are three steps to go through when using JuMP. The first is to set up the model and print it out, with `print(m1)` in this example. There is no problem here; Pluto displays the output correctly:
```
Min x1² + 0.5 x2² + 0.4 x3²
Subject to
 cons1b : 0.5 x1 + x2 == 10.0
 cons1c : 0.5 x1 + x3 == 20.0
```
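For completeness, here is a minimal sketch of a model that produces this printout. The variable and constraint names (`x1`–`x3`, `cons1b`, `cons1c`) are taken from the output above; choosing Ipopt as the solver is an assumption based on the log further down:

```julia
# Sketch of the model behind the printout above.
# Assumes Ipopt as the solver (as in the log shown later in this post).
using JuMP, Ipopt

m1 = Model(Ipopt.Optimizer)
@variable(m1, x1)
@variable(m1, x2)
@variable(m1, x3)
@objective(m1, Min, x1^2 + 0.5x2^2 + 0.4x3^2)
@constraint(m1, cons1b, 0.5x1 + x2 == 10.0)
@constraint(m1, cons1c, 0.5x1 + x3 == 20.0)
print(m1)
```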
The second is to display the solver's progress and whether a solution was found. When we run `optimize!(m1)` in VS Code, Atom, or Jupyter, the following information pops up automatically:
```
******************************************************************************
This program contains Ipopt, a library for large-scale nonlinear optimization.
 Ipopt is released as open source code under the Eclipse Public License (EPL).
         For more information visit https://github.com/coin-or/Ipopt
******************************************************************************

This is Ipopt version 3.13.4, running with linear solver mumps.
NOTE: Other linear solvers might be more efficient (see Ipopt documentation).

Number of nonzeros in equality constraint Jacobian...:        4
Number of nonzeros in inequality constraint Jacobian.:        0
Number of nonzeros in Lagrangian Hessian.............:        3

Total number of variables............................:        3
                     variables with only lower bounds:        0
                variables with lower and upper bounds:        0
                     variables with only upper bounds:        0
Total number of equality constraints.................:        2
Total number of inequality constraints...............:        0
        inequality constraints with only lower bounds:        0
   inequality constraints with lower and upper bounds:        0
        inequality constraints with only upper bounds:        0

iter    objective    inf_pr   inf_du lg(mu)  ||d||  lg(rg) alpha_du alpha_pr  ls
   0  0.0000000e+00 2.00e+01 0.00e+00  -1.0 0.00e+00    -  0.00e+00 0.00e+00   0
   1  1.7551020e+02 0.00e+00 0.00e+00  -1.0 1.73e+01    -  1.00e+00 1.00e+00h  1

Number of Iterations....: 1

                                   (scaled)                 (unscaled)
Objective...............:   1.7551020408163265e+02    1.7551020408163265e+02
Dual infeasibility......:   0.0000000000000000e+00    0.0000000000000000e+00
Constraint violation....:   0.0000000000000000e+00    0.0000000000000000e+00
Complementarity.........:   0.0000000000000000e+00    0.0000000000000000e+00
Overall NLP error.......:   0.0000000000000000e+00    0.0000000000000000e+00

Number of objective function evaluations             = 2
Number of objective gradient evaluations             = 2
Number of equality constraint evaluations            = 2
Number of inequality constraint evaluations          = 0
Number of equality constraint Jacobian evaluations   = 1
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations             = 1
Total CPU secs in IPOPT (w/o function evaluations)   = 0.237
Total CPU secs in NLP function evaluations           = 0.104

EXIT: Optimal Solution Found.
```
However, to get this information in Pluto, I had to resort to PlutoUI's `with_terminal`:

```julia
with_terminal() do
    optimize!(m1)
end
```
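If you only need the log as plain text rather than a live terminal widget, stdout can also be captured into a string with Base's `redirect_stdout`. A minimal sketch; the helper name `capture_stdout` is my own, not a Pluto or PlutoUI API:

```julia
# Hypothetical helper: run f() and return whatever it printed to stdout.
function capture_stdout(f)
    old = stdout
    rd, wr = redirect_stdout()      # reroute stdout into a pipe
    local out
    try
        f()
    finally
        redirect_stdout(old)        # restore the real stdout
        close(wr)
        out = read(rd, String)      # collect everything that was printed
    end
    return out
end

log = capture_stdout(() -> println("solver output goes here"))
```

The captured string can then be displayed in a Pluto cell, e.g. with `Text(log)`.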
The third is to display the model's output, which consists of six elements in this case:

```
Objective value: 175.51020408163265
x1 = 5.3061224489795915
x2 = 7.346938775510204
x3 = 17.346938775510203
Lagrangian Multiplier for constraint (1b) = 7.346938775510204
Lagrangian Multiplier for constraint (1c) = 13.877551020408163
```
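As a sanity check, these six numbers can be reproduced without a solver: since the problem is a convex quadratic program with only equality constraints, the KKT conditions form a linear system in `(x1, x2, x3, λ1, λ2)` that plain Julia can solve directly:

```julia
# KKT system for:  min  x1² + 0.5 x2² + 0.4 x3²
#                  s.t. 0.5 x1 + x2 = 10,  0.5 x1 + x3 = 20
# Stationarity:  2 x1 = 0.5 λ1 + 0.5 λ2,   x2 = λ1,   0.8 x3 = λ2
using LinearAlgebra

A = [2.0  0.0  0.0  -0.5  -0.5;   # ∂L/∂x1 = 0
     0.0  1.0  0.0  -1.0   0.0;   # ∂L/∂x2 = 0
     0.0  0.0  0.8   0.0  -1.0;   # ∂L/∂x3 = 0
     0.5  1.0  0.0   0.0   0.0;   # constraint (1b)
     0.5  0.0  1.0   0.0   0.0]   # constraint (1c)
b = [0.0, 0.0, 0.0, 10.0, 20.0]

x1, x2, x3, λ1, λ2 = A \ b
objective = x1^2 + 0.5x2^2 + 0.4x3^2
```

Solving this system gives exactly the objective value, primal solution, and multipliers reported above.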
Again, I had to resort to the terminal to get a table-like display. Note that the variables here are `x1`–`x3`, matching the model above, and that current JuMP uses `value` and `dual` rather than the old `getvalue`/`getdual` accessors:

```julia
with_terminal() do
    println("Objective value = ", objective_value(m1))
    println("x1 = ", value(x1))
    println("x2 = ", value(x2))
    println("x3 = ", value(x3))
    println("Lagrangian Multiplier for constraint (1b) = ", dual(cons1b))
    println("Lagrangian Multiplier for constraint (1c) = ", dual(cons1c))
end
```
@disberd, thanks. Now, I can finish my set of Pluto notebooks. In this migration process, you started helping me by providing a solution to integrate PlotlyJS into Pluto and sorting out small obstacles that appeared along the way. It works immaculately. Then, you finished it with JuMP. I owe you a lot. Thank you very, very much.