Setting nonlinear objectives with JuMP

Dear all, a year ago I wrote a short note on how to use Julia from R through the ‘JuliaCall’ package. I wanted to update this page and ran into some problems with Julia’s JuMP package.
See Notes on JuliaCall at https://hwborchers.github.io/.

The problem I want to solve is minimizing the (generalized) Rosenbrock function under the constraints 0.0 <= x_i and sum(x) == 1. The Rosenbrock function is defined as

    function rosen(x...)
        n = length(x); s = 0.0
        for i = 1:n-1
            s += 100*(x[i+1] - x[i]^2)^2 + (x[i] - 1)^2
        end
        return s
    end
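As a quick sanity check, rosen is zero at its unconstrained minimum x = (1, ..., 1), since every term of the sum vanishes there. The snippet below restates the function so it runs on its own:

```julia
# rosen restated from above so this snippet is self-contained
function rosen(x...)
    n = length(x); s = 0.0
    for i = 1:n-1
        s += 100*(x[i+1] - x[i]^2)^2 + (x[i] - 1)^2
    end
    return s
end

rosen(ones(10)...)    # 0.0: every term vanishes at x = (1, ..., 1)
```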

Here is the version for minimizing Rosenbrock (with the constraints 0.0 <= x_i <= 0.5 only) that was correct last year:

01  using JuMP, Ipopt
02  m = Model(solver = IpoptSolver());

03  @variable(m, 0.0 <= x[1:10] <= 0.5);
04  for i in 1:10 setvalue(x[i], 0.1); end;

05  JuMP.register(m, :rosen, 10, rosen, autodiff=true);
06  JuMP.setNLobjective(m, :Min,
07                      Expr(:call, :rosen, [x[i] for i=1:10]...));

08  sol = solve(m);
09  getvalue(x)
    ##  [1] 0.5000000000 0.2630659929 0.0800311191 0.0165742352 0.0103806763
    ##  [6] 0.0102120052 0.0102084109 0.0102042121 0.0100040851 0.0001000822

Please note that lines (5-7) were suggested to me when I asked for help here on the Julia discussion forum.

I have since learned that I have to make the following changes:

(2) Replace the solver in the Model call with “Model(Ipopt.Optimizer)”
(8) Replace “solve” with “optimize!”
(9) Replace “getvalue(x)” with “JuMP.value.(x)”

The following points are still unclear to me, even after searching the documentation:

(5) Do I still need to register the rosen function?
(*) Do I need to use @NLexpression?
(6) How do I set a nonlinear objective function (the replacement for setNLobjective)?

If I can get this example right, I think I can handle the rest of my applications myself. Of course, a link to the relevant documentation pages would also be helpful.

Many thanks.

Is this sufficient?

using JuMP
import Ipopt
m = Model(Ipopt.Optimizer)
@variable(m, 0.0 <= x[1:10] <= 0.5, start = 0.1)
register(m, :rosen, 10, rosen, autodiff=true)
@NLobjective(m, Min, rosen(x...))
optimize!(m)
value.(x)
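If you want to confirm that the solve actually succeeded, JuMP 1.x also provides status queries. A minimal sketch with a toy two-variable model of my own (not your Rosenbrock example, and assuming Ipopt is installed):

```julia
using JuMP
import Ipopt

# Toy model just to illustrate the post-solve queries
m = Model(Ipopt.Optimizer)
set_silent(m)                      # suppress Ipopt's iteration log
@variable(m, 0.0 <= x[1:2] <= 0.5, start = 0.1)
@NLobjective(m, Min, (x[1] - 1)^2 + (x[2] - 1)^2)
optimize!(m)

termination_status(m)              # LOCALLY_SOLVED when Ipopt converges
objective_value(m)                 # objective value at the solution
value.(x)                          # note the broadcasting dot for vectors
```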

If you want to see the raw expression input, here are the docs: Nonlinear Modeling · JuMP

I’ll also add that the old code you reference is from JuMP v0.18. We changed a lot during the transition to MathOptInterface in 2019, but we recently released JuMP 1.0 (JuMP 1.0.0 is released | JuMP), so this is syntax you can rely on without the risk of things breaking in the future.

Thanks a lot. In the meantime, I managed to get this to work. I still think there should be more examples concerning nonlinear objective functions and constraints. On the other hand: as I got this to run before I read your answer, the docs are not bad :slight_smile: !

function rosen(x...)
    n = length(x); s = 0.0
    for i = 1:n-1
        s += 100*(x[i+1] - x[i]^2)^2 + (x[i] - 1)^2
    end
    return s
end

using JuMP, Ipopt

m = Model(Ipopt.Optimizer);
@variable(m, 0.0 <= x[1:10] <= 1.0);
for i in 1:10 set_start_value(x[i], 0.1); end;
@constraint(m, c1, sum(x) == 1.0);

JuMP.register(m, :rosen, 10, rosen, autodiff=true);
@NLexpression(m, e1, rosen(x...));
@NLobjective(m, Min, e1);

optimize!(m);
xmin = JuMP.value.(x)
##  10-element Vector{Float64}:
##   0.5
##   0.2759742637043274
##   0.09633073776545037
##   0.027604360420402374
##   0.018857245635309217
##   0.018439883710793973
##   0.018423364058042388
##   0.018408615621349985
##   0.01803449829941936
##   0.007927020786201111
rosen(xmin...)
##  7.663371563695662
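One further check worth making: the solution should satisfy the equality constraint sum(x) == 1. Using the values printed above (the tiny deviation from 1.0 is within Ipopt's default tolerance):

```julia
# Solution vector copied from the output printed above
xmin = [0.5, 0.2759742637043274, 0.09633073776545037, 0.027604360420402374,
        0.018857245635309217, 0.018439883710793973, 0.018423364058042388,
        0.018408615621349985, 0.01803449829941936, 0.007927020786201111]

sum(xmin)    # ≈ 1.0, so the constraint c1 is satisfied
```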

Is this section sufficient? Introduction · JuMP