Ipopt and external function

I am trying to minimize a non-linear function using JuMP. From what I saw, I have to use Ipopt and, as https://jump.dev/JuMP.jl/v0.20.0/nlp/ says, I cannot directly include an external function in @NLobjective. To get around this I define a new variable aux and equate it to my external function as suggested (see below), but this does not seem to work. Any ideas what is wrong? I am open to other solutions too.

using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
function fopt(x, y)
    return (1 - x)^2 + 100 * (y - x^2)^2
end
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)
@variable(model, aux)
@constraint(model, aux == fopt(x, y))
@NLobjective(model, Min, aux)

I know I could just put the output of fopt in the NLobjective, but I would like to know whether there is a way to do this using this approach. At some point, I will have a much more complicated function that I want to minimize.

Thank you!

I think you want to register the function as in: https://jump.dev/JuMP.jl/latest/nlp/#User-defined-Functions-1

I got a solution of (x=1.0000000000034162, y=1.0000000000069469) with the following code:

using JuMP
using Ipopt

model = Model(Ipopt.Optimizer)
function fopt(x, y)
    return (1 - x)^2 + 100 * (y - x^2)^2
end
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)

register(model, :fopt, 2, fopt, autodiff = true)

@NLobjective(model, Min, fopt(x, y))
optimize!(model)

Thank you! This works great!

Is this how one would solve these sorts of equations in Julia?

In particular, I usually have a very complicated f() (calling many other functions, etc.) that I want to minimize while also constraining the space in which the inputs of f can lie.

Do you want a best solution (i.e., global optima) or a good-enough solution (local optima)?

Global (within the support).

Well, it seems like Ipopt works for global optimization; its documentation says:

Global convergence of the method is ensured by a line search procedure, based on a filter method.

I have only searched for global solutions to non-linear functions once, and I remember that if you want to be able to use a function that may do anything, then you do not have many options. If you just need something that is non-linear but very restricted, like two variables multiplying each other, then many commercial solvers support that kind of non-linearity (i.e., quadratic functions).
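As a side note, that restricted kind of non-linearity can even be written directly in a JuMP objective, with no register() or @NLobjective needed. A minimal sketch (assuming JuMP with Ipopt installed; any QP-capable solver would work the same way):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, y)
# x * y is a quadratic term, so @objective accepts it directly;
# no function registration is required for quadratic expressions.
@objective(model, Min, x^2 + x * y + y^2)
```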


You could achieve that by adding bounds to the variables, e.g. @variable(model, 2 <= x <= 10, start = 2.0) (the start value should lie within the bounds).
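Putting bounds together with the registered-function approach from above, a sketch might look like this (reusing the fopt from earlier in the thread; the specific bounds 2 and 10 are just for illustration):

```julia
using JuMP, Ipopt

# Same Rosenbrock-style objective as earlier in the thread.
fopt(x, y) = (1 - x)^2 + 100 * (y - x^2)^2

model = Model(Ipopt.Optimizer)
# Box bounds restrict the space the inputs of fopt may lie in.
@variable(model, 2 <= x <= 10, start = 2.0)
@variable(model, 2 <= y <= 10, start = 2.0)
register(model, :fopt, 2, fopt, autodiff = true)
@NLobjective(model, Min, fopt(x, y))
optimize!(model)
```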


seems like Ipopt works for global optimization

This is only correct if the problem is convex.


I don’t think “global” in this quote means what you think. I think it just means that the method will converge, to a local optimum, from wherever you start. Actually, the Ipopt docs say

Ipopt implements an interior point line search filter method that aims to find a local solution of (NLP).


It seems I was wrong. It is really unfortunate that their FAQ uses the term global to mean local; the two adjectives have well-defined and opposite meanings in optimization. The entire FAQ page gives no hint that it only solves convex problems to optimality and that for nonconvex problems it is limited to local optima.

Yes, I can confirm that it is indeed a local solution method. Just a simple example:

using JuMP, Ipopt

function f(x)
    return abs(5 * x^2 - (2 * x - 3)^3)
end
obj(x) = f(x);
model = Model(with_optimizer(Ipopt.Optimizer, max_iter=100, print_level=0));
# adding variables
@variable(model, 0 <= x <= 10);
# register the function
register(model, :obj, 1, obj, autodiff=true);
@NLobjective(model, Min, obj(x));
@time JuMP.optimize!(model);
println("\nValue of x* is = ", string(value(x)))
println("Value of f(x*) is = ", string(f(value(x))))
println("...but value of f(3.453) is = ", string(f(3.453)))


Another remark about this point. JuMP’s termination status values indicate what kind of guarantee comes with the solution that is returned. In the case of Ipopt, the termination status is LOCALLY_SOLVED, which indicates that it only ensures local optimality conditions. Solvers that provide global optimality guarantees would instead return the status OPTIMAL.
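For example, checking the status on the registered-function model from earlier in the thread (a sketch, assuming JuMP with Ipopt installed; MOI is the MathOptInterface alias that JuMP exports):

```julia
using JuMP, Ipopt

fopt(x, y) = (1 - x)^2 + 100 * (y - x^2)^2
model = Model(Ipopt.Optimizer)
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)
register(model, :fopt, 2, fopt, autodiff = true)
@NLobjective(model, Min, fopt(x, y))
optimize!(model)

# Ipopt reports LOCALLY_SOLVED rather than OPTIMAL:
# a local guarantee, not a proof of global optimality.
println(termination_status(model))
```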