Hi @justinberi, welcome to the forum.
You’re running into a couple of issues:
- After registering an operator, you need to call `op_f` instead of `residual_fx`
- You need to get the number of input arguments correct
- You need to splat the input, as in `op_f(x...)`
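To see why the splat and `collect` are needed: JuMP passes scalar arguments to a registered operator, so a function written against a vector has to be wrapped. Here is a minimal sketch in plain Julia (no JuMP required; `residual` and the `wrapped*` names are toy examples, not from your code):

```julia
# Toy residual written against a vector, like residual_fx in the post.
residual(x::Vector) = [x[1]^2 + x[2], x[1] - x[2]^2]

# A registered operator receives scalars, so splat them back into a vector
# with collect, and return one scalar component per operator.
wrapped1(x...) = residual(collect(x))[1]
wrapped2(x...) = residual(collect(x))[2]

wrapped1(1.0, 2.0)  # same as residual([1.0, 2.0])[1]
```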
But those are relatively minor. A bigger blocker is that operators must return a scalar, whereas yours returns a vector. One solution is to register one operator per component:
```julia
import JuMP
import Ipopt
import ForwardDiff as FD
Q = [1.65539 2.89376; 2.89376 6.51521];
q = [2; -3]
f(x) = 0.5*x'*Q*x + q'*x + exp(-1.3*x[1] + 0.3*x[2]^2)
kkt_conditions(x) = FD.gradient(f,x)
residual_fx(_x) = kkt_conditions(_x)
x0 = [-0.9512129986081451, 0.8061342694354091]
model = JuMP.Model(Ipopt.Optimizer)
JuMP.@operator(model, op_f1, 2, (x...) -> residual_fx(collect(x))[1])
JuMP.@operator(model, op_f2, 2, (x...) -> residual_fx(collect(x))[2])
JuMP.@variable(model, x[i=1:2], start = x0[i])
JuMP.@constraint(model, op_f1(x...) == 0)
JuMP.@constraint(model, op_f2(x...) == 0)
JuMP.optimize!(model)
```
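As a side note, for this particular `f` the gradient can be written in closed form, which is handy for checking a solution without re-running ForwardDiff. A sketch using only Base Julia (`grad_f` is a name I'm introducing here, not part of your code):

```julia
Q = [1.65539 2.89376; 2.89376 6.51521]
q = [2.0, -3.0]
f(x) = 0.5 * x' * Q * x + q' * x + exp(-1.3 * x[1] + 0.3 * x[2]^2)

# Hand-derived gradient of f: ∇(0.5x'Qx) = Qx for symmetric Q, plus q,
# plus the chain rule applied to the exponential term.
grad_f(x) = Q * x + q + exp(-1.3 * x[1] + 0.3 * x[2]^2) .* [-1.3, 0.6 * x[2]]
```

At a solution `x` returned by the model above, `grad_f(JuMP.value.(x))` should be approximately zero.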
If performance is an issue, you can implement this work-around: