ANN: JuliaSmoothOptimizers just got smoother - reverse-mode AD support

I am glad to announce that you can now use reverse-mode automatic differentiation (AD) packages such as Zygote and ReverseDiff in the JuliaSmoothOptimizers ecosystem. This is part of an ongoing effort towards an AD-based first-order augmented Lagrangian framework, which is close but not quite there yet. Here is the syntax for defining a model that uses Zygote as its AD backend:

using ADNLPModels, LinearAlgebra, Zygote

x0 = zeros(2)       # starting point
f(x) = dot(x, x)    # objective function
nlp = ADNLPModel(f, x0, adbackend = ADNLPModels.ZygoteAD())
x = ones(2)
ADNLPModels.grad(nlp, x) == [2, 2]   # gradient of dot(x, x) is 2x, computed by Zygote
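
Switching to ReverseDiff should only be a matter of picking a different backend. Here is a minimal sketch of the same model, assuming a ReverseDiffAD backend constructor analogous to ZygoteAD above (check the ADNLPModels documentation for the exact name and arguments):

using ADNLPModels, LinearAlgebra, ReverseDiff

x0 = zeros(2)
f(x) = dot(x, x)
# assumption: ReverseDiffAD() is constructed the same way as ZygoteAD()
nlp_rd = ADNLPModel(f, x0, adbackend = ADNLPModels.ReverseDiffAD())
ADNLPModels.grad(nlp_rd, ones(2)) == [2, 2]   # same gradient, computed by ReverseDiff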

Happy optimisation!
