I am trying to solve a non-linear optimization problem using JuMP. From what I saw, I have to use Ipopt and, as https://jump.dev/JuMP.jl/v0.20.0/nlp/ says, I cannot include the function directly in @NLobjective. To get around this, I define a new variable aux and equate it to my external function as suggested (see below), but this does not seem to work. Any ideas what is wrong? Open to other solutions too.

model = Model(Ipopt.Optimizer)
function fopt(x, y)
    return (1 - x)^2 + 100 * (y - x^2)^2
end
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)
@variable(model, aux)
@constraint(model, aux == fopt)
@NLobjective(model, Min, aux)

I know I could just put the output of fopt in the NLobjective, but I would like to know whether there is a way to do this using this approach. At some point, I will have a much more complicated function that I want to minimize.
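To be concrete, what I am hoping for is something along these lines (just a sketch; I am not sure register and @NLconstraint are the right tools here):

```julia
using JuMP, Ipopt

fopt(x, y) = (1 - x)^2 + 100 * (y - x^2)^2

model = Model(Ipopt.Optimizer)
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)
@variable(model, aux)
# register fopt so JuMP can call and differentiate it,
# then use it inside @NLconstraint rather than @constraint
register(model, :fopt, 2, fopt, autodiff = true)
@NLconstraint(model, aux == fopt(x, y))
@NLobjective(model, Min, aux)
optimize!(model)
```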

Is this how one would solve these sorts of equations in Julia?

In particular, I usually have a very complicated f() (calling many other functions, etc.) that I want to minimize, while also constraining the space in which the inputs of f can lie.

I have only looked into global solvers for non-linear functions once, but I remember that if you want to use an arbitrary function that may do anything, you do not have many options. If you only need something non-linear but very restricted, like two variables multiplied together, then many commercial solvers support that kind of non-linearity (i.e., quadratic functions).
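For example, a product of two variables can go straight into a quadratic objective, with no user-defined function machinery needed (a sketch using Ipopt here, though QP-capable commercial solvers accept the same kind of model):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, 0 <= x <= 1)
@variable(model, 0 <= y <= 1)
# x * y is a bilinear (quadratic) term, a restricted form of
# non-linearity that many solvers handle natively
@objective(model, Min, x * y + x^2)
optimize!(model)
```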

I don't think "global" in this quote means what you think. I think it just means that it will converge, to a local optimum, from wherever you start. Actually, the Ipopt docs say

Ipopt implements an interior point line search filter method that aims to find a local solution of (NLP).

It seems I was wrong. It is really unfortunate that their FAQ uses the term global to mean local; the two adjectives have well-defined and opposite meanings in optimization. The entire FAQ page gives no hint that it only solves convex problems to optimality and that on nonconvex problems it is limited to local optima.

Yes, I can confirm that it is indeed a local solution method. Just a simple example:

using JuMP, Ipopt

function f(x)
    return abs(5 * x^2 - (2 * x - 3)^3)
end
obj(x) = f(x)

model = Model(with_optimizer(Ipopt.Optimizer, max_iter = 100, print_level = 0))
# add variables
@variable(model, 0 <= x <= 10)
# register the function so it can be used in @NLobjective
register(model, :obj, 1, obj, autodiff = true)
@NLobjective(model, Min, obj(x))
@time JuMP.optimize!(model)
println("\nValue of x* is = ", string(value(x)))
println("Value of f(x*) is = ", string(f(value(x))))
println("...but value of f(3.453) is = ", string(f(3.453)))

Another remark on this point. JuMP's solver status values provide an indication of the properties of the solution that is returned. In the case of Ipopt, the termination status is LOCALLY_SOLVED, which indicates that it only guarantees local optimality conditions. Solvers that provide global optimality guarantees would return the status OPTIMAL.
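The status can be queried after solving; a minimal sketch (the model here is just an illustrative convex toy problem):

```julia
using JuMP, Ipopt

model = Model(with_optimizer(Ipopt.Optimizer, print_level = 0))
@variable(model, x >= 1)
@NLobjective(model, Min, (x - 2)^2)
JuMP.optimize!(model)

# Ipopt is a local solver, so it reports LOCALLY_SOLVED
# rather than OPTIMAL even on this convex problem
println(termination_status(model))
```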