# Problem with NTuple and VariableRef when using @NLconstraint in JuMP

Hi everybody,

Below is a simplified part of my code, which attempts to do nonlinear optimization using JuMP. I have both a cost and constraints that require vector input, but I am having issues with @NLconstraint. Would anyone know how I could make this piece of code run? I read somewhere that splatting has limitations (you cannot do linear algebra with it), but maybe there is a way around that. I am new to Julia, so any advice would be helpful.

```julia
using Zygote
using LinearAlgebra
using JuMP
using Ipopt

N = 100
d = 4
a = (N + 1) * d
b = (N + 1) * d
c = (N + 1) * d

size_overall = a + b + c

print(size_overall, "\n")

# say for this example
A = Matrix(I, d, d)
B = 2 .* Matrix(I, d, d)
# then

func1(q, z) = q' * A * z + z' * B * q
func2(q, z) = func1((q + z) ./ 2, z)

function cost_func(x...)
    dot(x, x)
end

model = Model(Ipopt.Optimizer)
@variable(model, x[1:size_overall])

JuMP.register(model, :f, size_overall, cost_func; autodiff = true)
@NLobjective(model, Min, f(x...))

function func_con(y...)
    x = y
    f_constr = zeros((N + 1) * 2d)
    for k = 1:N+1
        f_constr[(k-1)*2d+1:(k-1)*2d+d] .= x[a+(k-1)*d+1:a+k*d] + alpha(x[(k-1)*d+1:k*d], x[k*d+1:(k+1)*d]) - x[b+(k-1)*d+1:b+k*d]
        f_constr[(k-1)*2d+d+1:k*2d] .= x[a+k*d+1:a+(k+1)*d] - beta(x[(k-1)*d+1:k*d], x[k*d+1:(k+1)*d])
    end
    return f_constr
end

register(model, :func_con, size_overall, func_con; autodiff = true)
@NLconstraint(model, [i = 1:size_overall], func_con(x...)[i] == 0)

@time optimize!(model)
```

You can convert between tuples and vectors as follows:

```julia
julia> x = (1, 2, 3)
(1, 2, 3)

julia> collect(x)
3-element Vector{Int64}:
 1
 2
 3

julia> y = [1, 2, 3]
3-element Vector{Int64}:
 1
 2
 3

julia> tuple(y...)
(1, 2, 3)
```

You should read: [Nonlinear tips and tricks · JuMP](https://jump.dev/JuMP.jl/stable/tutorials/Nonlinear%20programs/nlp_tricks/)

Are all your functions quadratic? (I couldn’t see anything more complicated than that.)

If so, try writing out the problem algebraically without using `@NL` macros. You can do vector-valued quadratic expressions in JuMP.
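For the quadratic case, the suggestion above can be sketched directly with `@objective`/`@constraint`, with no `@NL` macros and no splatting. This is a minimal sketch, assuming the model really is quadratic; the objective and the right-hand side of the constraint are made up for illustration:

```julia
using JuMP, Ipopt, LinearAlgebra

d = 4
A = Matrix(1.0I, d, d)
B = 2 .* Matrix(1.0I, d, d)

model = Model(Ipopt.Optimizer)
@variable(model, q[1:d])
@variable(model, z[1:d])

# func1(q, z) = q'A z + z'B q written algebraically: JuMP builds a
# quadratic expression directly, so no registration is needed
@objective(model, Min, dot(q, q) + dot(z, z))
@constraint(model, q' * A * z + z' * B * q <= 1.0)

optimize!(model)
```

Because JuMP sees the quadratic structure explicitly, it can pass exact derivatives to Ipopt instead of differentiating a splatted black-box function.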


Hi, thank you for the response! Yes, my initial `func1` could be more than quadratic, i.e. a general nonlinear function.

If I change the line `x = y` to `x = collect(y)`, an error still appears: MethodError: Cannot `convert` an object of type VariableRef to an object of type Float64…

I do not think the error is in `@NLconstraint(model, [i = 1:size_overall], func_con(x...)[i] == 0)`, as I have tested it with other, simpler constraints and it worked. I could be wrong, though.

I tried to make a simpler example with 3 variables and two constraints (I know they are convex), to make the discussion easier.

```julia
model = Model(Ipopt.Optimizer)
@variable(model, x[1:3])

# define cost function
function cost_func(x::T...) where {T<:Real}
    return dot(x, x) + dot([1.0; 2.0; 3.0], x)
end

# minimize x^⊤x + q^⊤x
JuMP.register(model, :f, 3, cost_func; autodiff = true)
@NLobjective(model, Min, f(x...))

# write the constraint terms as a function with vector output
# f_con_1 = x1 - x2
# f_con_2 = x3 - (x1 + x2)
function func_con(x::T...) where {T<:Real}
    f_constr = zeros(T, 2)
    f_constr[1] = x[1] - x[2]
    f_constr[2] = x[3] - (x[1] + x[2])
    return f_constr
end

# define a separate function for each component of the vector-valued function
func_con_1(x...) = func_con(x...)[1]
func_con_2(x...) = func_con(x...)[2]

# register each component individually
register(model, :func_con_1, 3, func_con_1; autodiff = true)
register(model, :func_con_2, 3, func_con_2; autodiff = true)
@NLconstraint(model, func_con_1(x...) <= 0)
@NLconstraint(model, func_con_2(x...) <= 0)
optimize!(model)
```

This works for 2 constraints, but how would you do it if your function outputs a vector of dimension 100?

@odow I think the question is:
Given a vector-valued function with n components in its output, can you generate the n nonlinear constraints automatically without manually defining and registering additional functions for each component?

> Given a vector-valued function with n components in its output, can you generate the n nonlinear constraints automatically without manually defining and registering additional functions for each component?

See this tutorial:
https://jump.dev/JuMP.jl/stable/tutorials/Nonlinear%20programs/nlp_tricks/#User-defined-functions-with-vector-outputs
You could also use the raw-expression input to loop over the registration of the functions:
https://jump.dev/JuMP.jl/stable/manual/nlp/#More-complicated-examples
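The raw-expression approach can be combined with `Symbol` to generate and register one scalar wrapper per output component in a loop. A sketch, assuming the legacy `register`/nonlinear interface used in this thread; `n` and `func_con` here are placeholders for the real dimensions and constraint function:

```julia
using JuMP, Ipopt

n = 3  # would be 100 (or size_overall) in the real problem
model = Model(Ipopt.Optimizer)
@variable(model, x[1:n])

# placeholder vector-valued constraint function
func_con(y...) = [y[i] - i for i in 1:n]

for i in 1:n
    f_i = Symbol("func_con_", i)
    # scalar wrapper that returns the i-th output component
    register(model, f_i, n, (y...) -> func_con(y...)[i]; autodiff = true)
    # raw-expression input lets us build the constraint programmatically
    add_nonlinear_constraint(model, :($(f_i)($(x...)) == 0))
end

optimize!(model)
```

Note that each wrapper recomputes the full output vector, so `func_con` is evaluated n times per solver iteration; the memoization trick in the vector-output tutorial linked above avoids that.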