ANN: JuliaSmoothOptimizers just got smoother - reverse-mode AD support

I am glad to announce that you can now use reverse-mode automatic differentiation (AD) packages such as Zygote and ReverseDiff in the JuliaSmoothOptimizers ecosystem. This is part of an effort towards an AD-based first-order augmented Lagrangian framework, which is close but not quite there yet. Here is the syntax for defining a model that makes use of Zygote:

using ADNLPModels, LinearAlgebra, Zygote

x0 = zeros(2)
f(x) = dot(x,x)
nlp = ADNLPModel(f, x0, adbackend = ADNLPModels.ZygoteAD())
x = ones(2)
ADNLPModels.grad(nlp, x) == [2, 2]

Happy optimisation!


I get the error below when I try to run this example:

UndefVarError: ZygoteAD not defined

The syntax has changed multiple times since the announcement. You basically have to do something like this now, which is a little more verbose:

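# Assemble a full ADModelBackend by hand: Zygote backends where available,
# ForwardDiff for the remaining operators.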
ZygoteAD() = ADNLPModels.ADModelBackend(
  ADNLPModels.ZygoteADGradient(),
  ADNLPModels.ForwardDiffADHvprod(),
  ADNLPModels.ZygoteADJprod(),
  ADNLPModels.ZygoteADJtprod(),
  ADNLPModels.ZygoteADJacobian(0),
  ADNLPModels.ZygoteADHessian(0),
  ADNLPModels.ForwardDiffADGHjvprod(),
)
nlp = ADNLPModel(f, x0)
set_adbackend!(nlp, ZygoteAD())
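
With the backend swapped in, the gradient check from the announcement should still go through Zygote (a minimal check, reusing f and x0 from the first example):

x = ones(2)
ADNLPModels.grad(nlp, x) == [2, 2]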

Hi @richinex

The syntax has changed a bit to make ADNLPModel more flexible.
You can have a look at the documentation here: Backend · ADNLPModels.jl

The following should work:

using ADNLPModels, LinearAlgebra, Zygote

x0 = zeros(2)
f(x) = dot(x,x)
nlp = ADNLPModel(f, x0, gradient_backend = ADNLPModels.ZygoteADGradient)
x = ones(2)
ADNLPModels.grad(nlp, x) == [2, 2]
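
Since the announcement also mentions ReverseDiff, the analogous model should work with a ReverseDiff gradient backend. This is only a sketch: it assumes ADNLPModels.ReverseDiffADGradient is available in your version of the package.

using ADNLPModels, LinearAlgebra, ReverseDiff

x0 = zeros(2)
f(x) = dot(x, x)
# swap the gradient backend for the ReverseDiff one
nlp = ADNLPModel(f, x0, gradient_backend = ADNLPModels.ReverseDiffADGradient)
x = ones(2)
ADNLPModels.grad(nlp, x) == [2, 2]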

I realised my initial suggestion doesn’t actually end up using Zygote, so I made an edit. But I would use @tmigot’s suggestion anyway, since he is a core dev of the package and I am not.


Cool. Thanks