# JuMP nonlinear constraints for different optimisation variables

I would like to define multiple nonlinear constraints in JuMP, where each constraint applies to a subset of the optimisation variables. In my concrete use case, I am trying to constrain a robot motion plan: the optimisation variables are the joint positions over time, defined at a finite set of keyframes of the motion plan.

Assuming the robot has 6 joints, and that we know the desired tip {x, y, z} positional trajectory, I formulate the optimisation problem as follows:

```julia
# array of desired [x, y, z] tip positions over time
desired_tip_position = [TransToRp(T) for T in tip_trajectory]

# forward kinematics: joint positions -> [x, y, z] tip position
fkin_position(thetalist) = TransToRp(FKinSpace(M, Slist, thetalist))

# squared distance between the actual and the desired tip position
function my_constraint(state...)
    position = [i for i in state[1:3]]
    thetalist = [i for i in state[4:end]]
    sum((fkin_position(thetalist) - position).^2)
end

register(model, :my_constraint, 3 + 6, my_constraint, autodiff=true)

for (i, thetalist) in enumerate(Iterators.partition(x, 6))
    @NLconstraint(model, my_constraint(desired_tip_position[i][1],
                                       desired_tip_position[i][2],
                                       desired_tip_position[i][3],
                                       thetalist...) <= 1e-6)
end
```

The above formulation works as expected. However, I am not completely satisfied with this solution; in particular, I am not happy with passing the position components one by one in the `@NLconstraint` definition.

I have thought about just passing `my_constraint(desired_tip_position[i]..., thetalist...)`, but JuMP returns: `JuMP supports splatting only symbols. For example, x... is ok, but (x + 1)..., [x; y]... and g(f(y)...) are not.`
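Since the error message says splatting a plain symbol is allowed, one workaround I am considering is binding the indexed position to a local variable first. This is an untested sketch of that idea, with the same `model`, `x`, and `my_constraint` as above:

```julia
for (i, thetalist) in enumerate(Iterators.partition(x, 6))
    # bind the indexed expression to a plain symbol, so the splat
    # has the form p..., which the macro is documented to accept
    p = desired_tip_position[i]
    @NLconstraint(model, my_constraint(p..., thetalist...) <= 1e-6)
end
```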

I have also tried something slightly different: passing `i` to `my_constraint` and indexing `desired_tip_position` there:

```julia
# look up the desired position by keyframe index instead of
# passing its components explicitly
function my_constraint(state...)
    position = desired_tip_position[Int(state[1])]
    thetalist = [i for i in state[2:end]]
    sum((fkin_position(thetalist) - position).^2)
end

register(model, :my_constraint, 1 + 6, my_constraint, autodiff=true)

for (i, thetalist) in enumerate(Iterators.partition(x, 6))
    @NLconstraint(model, my_constraint(i, thetalist...) <= 1e-6)
end
```

However, when I do that, `i` arrives in `my_constraint` as a `Float64`, and I cannot index an array with a float. Converting it with `Int(...)` makes `ForwardDiff.jl` unhappy as well…
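My current understanding is that, during differentiation, the registered function is called with `ForwardDiff.Dual` numbers, which cannot be converted to `Int`. A minimal illustration outside JuMP (the function names here are just stand-ins for my constraint):

```julia
using ForwardDiff

# stand-in for the registered constraint: uses its first argument
# as a plain value
f(state...) = state[1] * 2.0

# same, but tries to convert the first argument to an index
g(state...) = Int(state[1]) * 2.0

# differentiating f is fine:
ForwardDiff.gradient(x -> f(x...), [3.0, 1.0])

# differentiating g throws, because inside the gradient call
# state[1] is a Dual number, not a Float64:
# ForwardDiff.gradient(x -> g(x...), [3.0, 1.0])
```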

Does anyone have any advice regarding a better way to accomplish this? Thanks!