Using Knitro with ForwardDiff and JuMP

Hi,

I am trying to use Knitro with a JuMP model that contains a constraint imposing the gradient of a function to be 0, and I want to compute that gradient using ForwardDiff.

I found this can be done using Ipopt, but I cannot get it to work using Knitro. For instance, I tried the code from this example (@constraint with ForwardDiff.gradient - #2 by odow) with the Knitro optimizer, but I got an error message saying the constraint type is not supported.
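For reference, here is the Ipopt version of the same pattern, which runs for me (a minimal sketch; only the optimizer differs from the Knitro MWE below):

using ForwardDiff, JuMP, Ipopt

Q = [1.65539  2.89376; 2.89376  6.51521]
q = [2; -3]
f(x) = 0.5 * x' * Q * x + q' * x + exp(-1.3 * x[1] + 0.3 * x[2]^2)
kkt_conditions(x) = ForwardDiff.gradient(f, x)

model = Model(Ipopt.Optimizer)
JuMP.@variable(model, x[i = 1:2])
JuMP.@operator(model, op_f1, 2, (x...) -> kkt_conditions(collect(x))[1])
JuMP.@constraint(model, op_f1(x...) == 0)
JuMP.optimize!(model)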

Here is the MWE with Knitro:

using ForwardDiff, JuMP, KNITRO

Q = [1.65539  2.89376; 2.89376  6.51521]
q = [2; -3]
f(x) = 0.5 * x' * Q * x + q' * x + exp(-1.3 * x[1] + 0.3 * x[2]^2)
kkt_conditions(x) = ForwardDiff.gradient(f, x)

lm = KNITRO.LMcontext()
model = Model(() -> KNITRO.Optimizer(license_manager = lm,
                                     outlev = 1))
JuMP.@variable(model, x[i=1:2])
residual_fx(_x) = kkt_conditions(_x)

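# Register the first component of the gradient residual as an operator and set it to zero.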
JuMP.@operator(model, op_f1, 2, (x...) -> residual_fx(collect(x))[1])
JuMP.@constraint(model, op_f1(x...) == 0)
JuMP.optimize!(model)

The error I get is:

ERROR: Constraints of type MathOptInterface.ScalarNonlinearFunction-in-MathOptInterface.EqualTo{Float64} are not supported by the solver.

If you expected the solver to support your problem, you may have an error in your formulation. Otherwise, consider using a different solver.

The list of available solvers, along with the problem types they support, is available at https://jump.dev/JuMP.jl/stable/installation/#Supported-solvers.

I checked the link in the error message, and it says Knitro supports NLPs too, so I am not sure how to adapt the code to work with ForwardDiff.

Thanks in advance for the help!

I expect this to work.

What is the output of import Pkg; Pkg.status()? You might have an old version of KNITRO.jl that just needs updating.
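For example, to check and update just KNITRO.jl (a sketch, assuming it is a direct dependency of your active environment):

import Pkg
Pkg.status("KNITRO")   # show the installed version of KNITRO.jl
Pkg.update("KNITRO")   # update it to the latest compatible release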


Amazing, it works now. I had v0.13.0; the code works after updating to v0.14.4. Thanks!
