Nonlinear objective with both splatted variable and vector parameters

Hi,

I’m currently writing a model predictive control package. The JuMP package is really helpful; thanks for everything.

For nonlinear programming, my decision variable is a vector, so I need to rely on the splatting syntax. I also need vector nonlinear parameters, created with @NLparameter(model, x[i=1:n] == x_val[i]). Given that:

The expression splatted can be only a symbol. More complex expressions are not recognized.

is there a way to create an objective function that takes both a non-scalar decision variable and non-scalar parameters as arguments?

For example, I can do:

using JuMP, Ipopt
function myfunc()
    model = Model(Ipopt.Optimizer)
    nvar = 3
    @variable(model, x[1:nvar])
    @NLparameter(model, a == 1)
    f(a, x...) = sum((x .- a).^2)
    register(model, :f, 1+nvar, f; autodiff = true)
    @NLobjective(model, Min, f(a, x...))
    optimize!(model)
    value.(x)
end
myfunc()

but this:

using JuMP, Ipopt
function myfunc()
    model = Model(Ipopt.Optimizer)
    nvar = 3
    @variable(model, x[1:nvar])
    @NLparameter(model, a[i=1:nvar] == 1)
    f(a, x...) = sum((x .- a).^2)
    register(model, :f, 1+nvar, f; autodiff = true)
    @NLobjective(model, Min, f(a, x...))
    optimize!(model)
    value.(x)
end
myfunc()

results in:

ERROR: Unexpected array NonlinearParameter[parameter[1] == 1.0, parameter[2] == 1.0, parameter[3] == 1.0] in nonlinear expression. Nonlinear expressions may contain only scalar expressions.

Thanks for your help,

Francis Gagnon

1 Like

Not the most elegant approach, but you could do this:


using JuMP, Ipopt
function myfunc()
    model = Model(Ipopt.Optimizer)
    nvar = 3
    @variable(model, x[1:nvar])
    @NLparameter(model, a[i=1:nvar] == 1)
    f(z...) = sum((z[nvar+1:end] .- z[1:nvar]).^2)
    register(model, :f, 2*nvar, f; autodiff = true)
    @NLobjective(model, Min, f(a..., x...))
    optimize!(model)
    value.(x)
end
myfunc()
1 Like

Thanks for the help; it is indeed a solution. I have multiple vector parameters, so yes, it will be a bit messy. But it’s better than no solution!
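
For concreteness, here is a rough sketch of what that bookkeeping could look like with a second, made-up parameter vector b added to the example above; the offsets into z are the messy part:

using JuMP, Ipopt
function myfunc_two_params()
    model = Model(Ipopt.Optimizer)
    nvar = 3
    @variable(model, x[1:nvar])
    @NLparameter(model, a[i=1:nvar] == 1)
    @NLparameter(model, b[i=1:nvar] == 2)
    # z arrives as one flat tuple: a is z[1:nvar], b is z[nvar+1:2nvar],
    # and x is z[2nvar+1:end]
    f(z...) = sum((z[2 * nvar + 1:end] .- z[1:nvar] .- z[nvar + 1:2 * nvar]).^2)
    register(model, :f, 3 * nvar, f; autodiff = true)
    @NLobjective(model, Min, f(a..., b..., x...))
    optimize!(model)
    value.(x)
end
myfunc_two_params()

Every extra parameter vector shifts the offsets, so it may help to compute them once and capture them in the closure.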

1 Like

Maybe you could use something like ComponentArrays.jl (see API · ComponentArrays.jl): construct a ComponentArray from your inputs and get the indices back via getaxes, or perhaps even access the components directly by label inside the f function.
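
Untested sketch of that idea; the exact ComponentArrays.jl calls (the keyword constructor, getaxes, and rebuilding a ComponentArray from a flat vector) may need adjusting:

using JuMP, Ipopt, ComponentArrays

function myfunc_ca()
    model = Model(Ipopt.Optimizer)
    nvar = 3
    @variable(model, x[1:nvar])
    @NLparameter(model, a[i=1:nvar] == 1)
    # Template array, used only for its axes (the label-to-index mapping)
    template = ComponentArray(a = zeros(nvar), x = zeros(nvar))
    ax = getaxes(template)
    function f(z...)
        c = ComponentArray(collect(z), ax...)  # re-attach the labels to the flat argument tuple
        return sum((c.x .- c.a).^2)
    end
    register(model, :f, 2 * nvar, f; autodiff = true)
    @NLobjective(model, Min, f(a..., x...))
    optimize!(model)
    value.(x)
end
myfunc_ca()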

1 Like

If the parameters are used exclusively in the registered function, you have the option of hiding them from JuMP via a closure:

using JuMP, Ipopt
function myfunc()
    model = Model(Ipopt.Optimizer)
    nvar = 3
    @variable(model, x[1:nvar])
    a = zeros(nvar)
    f(x...) = sum((x .- a).^2)
    register(model, :f, nvar, f; autodiff = true)
    @NLobjective(model, Min, f(x...))
    optimize!(model)
    @show value.(x)

    a[1] = 10.0
    optimize!(model)
    @show value.(x)

end
myfunc()

In this case it’s on you to be aware of the subtleties of updating the parameter from outside the closure.
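
One possible pattern (a sketch with made-up names, e.g. build_model) is to return the captured array alongside the model and mutate it in place between solves:

using JuMP, Ipopt

function build_model(nvar)
    model = Model(Ipopt.Optimizer)
    @variable(model, x[1:nvar])
    a = zeros(nvar)                # captured by the closure below
    f(x...) = sum((x .- a).^2)
    register(model, :f, nvar, f; autodiff = true)
    @NLobjective(model, Min, f(x...))
    return model, x, a
end

model, x, a = build_model(3)
optimize!(model)
@show value.(x)

a .= [10.0, 0.0, 0.0]   # mutate in place; `a = [10.0, 0.0, 0.0]` would rebind the name
optimize!(model)        # and the closure would keep seeing the old array
@show value.(x)

The key subtlety is to mutate (a .= ...) rather than rebind (a = ...), because the closure holds a reference to the original array.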

3 Likes

If the parameters are used exclusively in the registered function, you have the option of hiding them from JuMP via a closure:

Is this solution still applicable if I use my parameters in both the objective function and a nonlinear constraint function (thus two separate registered functions)?

Is this solution still applicable if I use my parameters in both the objective function and a nonlinear constraint function (thus two separate registered functions)?

Yes
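
For example, a sketch where both registered functions capture the same vector a (the constraint function g and its bound are made up for illustration):

using JuMP, Ipopt

function myfunc_shared()
    model = Model(Ipopt.Optimizer)
    nvar = 3
    @variable(model, x[1:nvar])
    a = ones(nvar)                       # shared by both closures
    f(x...) = sum((x .- a).^2)           # objective
    g(x...) = sum(a .* x)                # constraint left-hand side
    register(model, :f, nvar, f; autodiff = true)
    register(model, :g, nvar, g; autodiff = true)
    @NLobjective(model, Min, f(x...))
    @NLconstraint(model, g(x...) >= 1.0)
    optimize!(model)
    value.(x)
end
myfunc_shared()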

1 Like

Great, thanks! I’ll rely on your solution; it’s a bit tidier than splatting all the arguments.