MINLP optimization with user defined objective function

I’m still very new to Julia, and I’m having trouble with a MINLP optimization problem that I used to solve in MATLAB.
I can’t find a way to provide a user-defined objective function to the @NLobjective macro.
The problem is that my objective function cannot be written (as in most examples) as an analytic expression of my variables.
Here’s a MWE (in this particular case I could have written myfunction inline, but what I need is to do it with an “outer” function, as I did):

using JuMP, Ipopt, Cbc, Juniper

optimizer = Juniper.Optimizer
params = Dict{Symbol,Any}()
params[:nl_solver] = with_optimizer(Ipopt.Optimizer, print_level = 0)
m = Model(with_optimizer(optimizer, params))

@variable(m, 0 <= x[1:4] <= 6, Int)

function myfunction(x1, x2, x3, x4)
    xx = [x1, x2, x3, x4]
    return sum(xx[i] for i = 1:4)
end

register(m, :myfunction, 4, myfunction; autodiff = true)
@NLobjective(m, Max, myfunction(x[1], x[2], x[3], x[4]))



The error that arises is:

KeyError: key :myfunction not found

Any help is really appreciated. Thanks a lot!

I haven’t tested this, so there might be a typo, but you need to tell Juniper that you are registering the functions.

using JuMP
using Ipopt
using Juniper

function myfunction(x1, x2, x3, x4)
    xx = [x1, x2, x3, x4]
    return sum(xx[i] for i = 1:4)
end

model = Model(
    with_optimizer(
        Juniper.Optimizer,
        nl_solver = with_optimizer(Ipopt.Optimizer, print_level = 0),
        registered_functions = [
            Juniper.register(:myfunction, 4, myfunction; autodiff = true),
        ],
    ),
)
@variable(model, 0 <= x[1:4] <= 6, Int)
register(model, :myfunction, 4, myfunction; autodiff = true)
@NLobjective(model, Max, myfunction(x[1], x[2], x[3], x[4]))
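In case it helps, here is a minimal sketch of how you would then solve the model and inspect the result. This assumes the model built as above; `optimize!`, `termination_status`, `objective_value`, and `value` are the standard JuMP calls for this:

```julia
# Solve the MINLP with Juniper and query the solution.
optimize!(model)

# Check why the solver stopped (e.g. LOCALLY_SOLVED) before trusting the values.
println(termination_status(model))

# Objective value and the integer solution vector.
println(objective_value(model))
println(value.(x))
```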

cc @Wikunia

(Edit: I opened an issue https://github.com/lanl-ansi/Juniper.jl/issues/173)


Thanks for linking me. Yes, currently this is the way to go, even though it is ugly.

Thanks a lot. This is exactly what I needed :wink: