Hi All,

Thanks in advance for any help and suggestions. I’m new to Julia (and JuMP) so go easy.

Is there a way to use `ForwardDiff.gradient` within a nonlinear constraint in JuMP?

An example is

```julia
# Solve using JuMP interface to Ipopt
import JuMP # v1.15.1
import Ipopt
import ForwardDiff as FD

# Suggested by compiler, didn't work
# FD.can_dual(::Type{JuMP.AffExpr}) = true

### Avoid modification ###
Q = [1.65539  2.89376; 2.89376  6.51521];
q = [2;-3]

f(x) = 0.5*x'*Q*x + q'*x + exp(-1.3*x[1] + 0.3*x[2]^2)
function kkt_conditions(x)
    # Stationarity condition for the unconstrained problem: ∇f(x) = 0
    return FD.gradient(f, x)
end
residual_fx(_x) = kkt_conditions(_x)

# Initial condition
x0 = [-0.9512129986081451, 0.8061342694354091]
######

model = JuMP.Model(Ipopt.Optimizer)

# Helpers suggested in https://jump.dev/JuMP.jl/stable/manual/nonlinear/
my_fun(x...) = residual_fx(collect(x))
JuMP.@operator(model, op_f, 5, (x...) -> residual_fx(collect(x)))

# Solver setup
JuMP.@variable(model, x[i=1:2], start = x0[i])

## Tried
# JuMP.@constraint(model, my_fun(x...) .== 0) # MethodError: Cannot `convert` an object of type JuMP.AffExpr to an object of type JuMP.VariableRef
# JuMP.@constraint(model, op_f(x...) == 0) # MethodError: no method matching length(::JuMP.NonlinearExpr)

# Ideally this would be the best syntax
JuMP.@constraint(model, residual_fx(x) .== 0) # MethodError: Cannot `convert` an object of type JuMP.AffExpr to an object of type JuMP.VariableRef

# Solve
JuMP.optimize!(model)

print("Residual: ", residual_fx(JuMP.value.(x)))
```

This results in

```text
MethodError: Cannot `convert` an object of type JuMP.AffExpr to an object of type JuMP.VariableRef

Closest candidates are:
convert(::Type{T}, ::T) where T
@ Base Base.jl:64
(::Type{JuMP.GenericVariableRef{T}} where T)(::Any, ::Any)
@ JuMP ~/.julia/packages/JuMP/ToPd2/src/variables.jl:243

Stacktrace:
[1] cvt1
@ ./essentials.jl:418 [inlined]
[2] ntuple
@ ./ntuple.jl:49 [inlined]
[3] convert(#unused#::Type{Tuple{JuMP.VariableRef, JuMP.VariableRef}}, x::Tuple{JuMP.AffExpr, JuMP.AffExpr})
@ Base ./essentials.jl:419
[4] ForwardDiff.Partials{2, JuMP.VariableRef}(values::Tuple{JuMP.AffExpr, JuMP.AffExpr})
@ ForwardDiff ~/.julia/packages/ForwardDiff/PcZ48/src/partials.jl:2
[5] convert(#unused#::Type{ForwardDiff.Partials{2, JuMP.VariableRef}}, partials::ForwardDiff.Partials{2, JuMP.AffExpr})
@ ForwardDiff ~/.julia/packages/ForwardDiff/PcZ48/src/partials.jl:74
[6] cvt1
@ ./essentials.jl:418 [inlined]
[7] ntuple(f::Base.var"#cvt1#1"{Tuple{ForwardDiff.Partials{2, JuMP.VariableRef}, ForwardDiff.Partials{2, JuMP.VariableRef}}, Tuple{ForwardDiff.Partials{2, JuMP.AffExpr}, ForwardDiff.Partials{2, JuMP.AffExpr}}}, #unused#::Val{2})
@ Base ./ntuple.jl:49
[8] convert
@ ./essentials.jl:419 [inlined]
[9] GradientConfig
@ ~/.julia/packages/ForwardDiff/PcZ48/src/config.jl:98 [inlined]
[10] ForwardDiff.GradientConfig(f::typeof(f), x::Vector{JuMP.VariableRef}, ::ForwardDiff.Chunk{2}, ::ForwardDiff.Tag{typeof(f), JuMP.VariableRef})
@ ForwardDiff ~/.julia/packages/ForwardDiff/PcZ48/src/config.jl:123
[11] GradientConfig
@ ForwardDiff ~/.julia/packages/ForwardDiff/PcZ48/src/config.jl:121
[12] GradientConfig
@ ForwardDiff ~/.julia/packages/ForwardDiff/PcZ48/src/config.jl:121
[14] kkt_conditions
@ ./In[41]:19 [inlined]
[15] residual_fx(_x::Vector{JuMP.VariableRef})
@ Main ./In[56]:17
[16] macro expansion
@ ~/.julia/packages/MutableArithmetics/K9YPJ/src/rewrite.jl:321 [inlined]
[17] macro expansion
@ ~/.julia/packages/JuMP/ToPd2/src/macros.jl:717 [inlined]
[18] top-level scope
@ In[56]:37
```

Many of the use cases I am looking at involve third-party, vectorised constraint functions, so any solution that minimally changes the user-provided constraints would be amazing, if that's possible.


Hi @justinberi, welcome to the forum.

You’re running into a couple of issues:

• After registering an operator, you need to use `op_f` instead of `residual_fx`
• You need to get the input number of arguments correct
• You need to splat the input to `op_f(x...)`

But those are relatively minor. A bigger blocker is that operators must return a scalar, whereas yours returns a vector. One solution is to register one scalar operator per component:

```julia
import JuMP
import Ipopt
import ForwardDiff as FD
Q = [1.65539  2.89376; 2.89376  6.51521];
q = [2; -3]
f(x) = 0.5*x'*Q*x + q'*x + exp(-1.3*x[1] + 0.3*x[2]^2)
kkt_conditions(x) = FD.gradient(f, x)  # stationarity: ∇f(x) = 0
residual_fx(_x) = kkt_conditions(_x)
x0 = [-0.9512129986081451, 0.8061342694354091]
model = JuMP.Model(Ipopt.Optimizer)
JuMP.@operator(model, op_f1, 2, (x...) -> residual_fx(collect(x))[1])
JuMP.@operator(model, op_f2, 2, (x...) -> residual_fx(collect(x))[2])
JuMP.@variable(model, x[i=1:2], start = x0[i])
JuMP.@constraint(model, op_f1(x...) == 0)
JuMP.@constraint(model, op_f2(x...) == 0)
JuMP.optimize!(model)
```

If performance is an issue, you can implement the memoization work-around described in the JuMP manual, so the scalar operators share a single cached evaluation of the vector function:
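For reference, that work-around follows the memoization pattern from the JuMP manual: wrap the vector-valued residual so that each scalar component operator reuses one evaluation of the full vector, with a separate cache for the dual-number evaluations made during automatic differentiation. A minimal sketch, where `memoize`, the toy function `g`, and the counter `n_calls` are illustrative names:

```julia
# Memoization work-around (pattern from the JuMP manual): each scalar
# component operator reuses one evaluation of the full vector function.
function memoize(foo::Function, n_outputs::Int)
    last_x, last_f = nothing, nothing      # cache for Float64 inputs
    last_dx, last_dfdx = nothing, nothing  # separate cache for dual numbers
    function foo_i(i, x::T...) where {T<:Real}
        if T == Float64
            if x !== last_x               # only re-evaluate on a new point
                last_x, last_f = x, foo(x...)
            end
            return last_f[i]::T
        else
            if x !== last_dx
                last_dx, last_dfdx = x, foo(x...)
            end
            return last_dfdx[i]::T
        end
    end
    return [(x...) -> foo_i(i, x...) for i in 1:n_outputs]
end

# Toy stand-in for residual_fx, instrumented to count evaluations
n_calls = Ref(0)
function g(x...)
    n_calls[] += 1
    return [x[1]^2 + x[2], x[1] - x[2]^2]
end

g1, g2 = memoize(g, 2)
g1(1.0, 2.0)   # evaluates g once
g2(1.0, 2.0)   # reuses the cached vector; n_calls[] is still 1
```

With this, the two operators would be registered as `JuMP.@operator(model, op_f1, 2, memoized[1])` and `JuMP.@operator(model, op_f2, 2, memoized[2])` where `memoized = memoize((x...) -> residual_fx(collect(x)), 2)`, so `residual_fx` is evaluated once per point rather than once per component.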


Hey @odow, thanks for your help and the effort you are putting into a great community and package!

That solves the issue I was having.

Inevitably it brought up more questions but I will ask them in separate topics and link here.
