How to pass `ADNLPModels.ReverseDiffAD()` as the adbackend keyword argument to the `ADNLPModel` or `ADNLSModel` constructor

Hi all, I’m trying to minimize a function subject to a nonlinear vector constraint. The constraint function takes a vector as input and returns a vector of large size (more than 500 elements).

```julia

using ADNLPModels, NLPModels, NLPModelsIpopt  # NLPModels provides cons/jac; NLPModelsIpopt provides ipopt

N = 51

cons1(x::Vector{Float64}) = f123(x)  # constraint function (f123 defined elsewhere)

x0 = vcat(xguess, yguess, tfinal)    # initial guess vector
obj1(x) = x[1] + x[3]                # objective function

xlower = push!(-3 * ones(2 * N), pi)
xupper = push!(3 * ones(2 * N), 1.5 * pi)

consLower = -Inf * ones(4 * N)
consUpper = zeros(4 * N)

model = ADNLPModel(obj1, x0, xlower, xupper, cons1, consLower, consUpper)

println("cx = $(cons(model, model.meta.x0))")
# println("Jx = $(jac(model, model.meta.x0))")

output = ipopt(model)
xstar = output.solution
fstar = output.objective

```
I want to change the default backend of ADNLPModel from ForwardDiff to ReverseDiff, as suggested here. Thank you.

Hi. ReverseDiff is an optional dependency. The ReverseDiff backend only becomes available after you have imported ReverseDiff. Is that what you were asking?

I imported it, but I don’t know how to change the default option of ADNLPModel.
I want ADNLPModel to use ReverseDiff instead of ForwardDiff. Could you kindly provide the syntax?

There are several examples in the tests, e.g., ADNLPModels.jl/basic.jl at main · JuliaSmoothOptimizers/ADNLPModels.jl · GitHub.

Define your model as you did, import ReverseDiff, and add the argument backend = ReverseDiffAD.
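If it helps, here is a minimal sketch of what that looks like, using a toy objective and constraint in place of the real ones (the exact keyword name and backend type may differ depending on your installed version of ADNLPModels):

```julia
using ADNLPModels, ReverseDiff

# Toy problem standing in for the real objective and constraint.
f(x) = x[1] + x[3]
c(x) = [x[1]^2 + x[2]^2 - 1.0]
x0 = ones(3)

# Pass the backend when constructing the model so derivatives
# are computed with ReverseDiff instead of the ForwardDiff default.
model = ADNLPModel(f, x0, c, [-Inf], [0.0]; backend = ADNLPModels.ReverseDiffAD)
```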

Thanks a lot, and sorry for the trouble. Next time I will go through the documentation thoroughly.

No trouble! Don’t hesitate.

1 Like

@dpo, I have gone through the link you provided, but I’m getting an error. I have already installed the Zygote and ReverseDiff packages.

```julia

using ReverseDiff
using Zygote

using ADNLPModels
using Test

ADNLPModels.ADNLPModel

function test_autodiff_backend_error()
  @testset "Error without loading package - $adbackend" for adbackend in (ZygoteAD, ReverseDiffAD)
    adbackend = if adbackend == ZygoteAD
      eval(adbackend)(0, 0)
    else
      eval(adbackend){Nothing}(0, 0, nothing)
    end
    @test_throws ArgumentError gradient(adbackend, sum, [1.0])
    @test_throws ArgumentError gradient!(adbackend, [1.0], sum, [1.0])
  end
end

test_autodiff_backend_error()

```

The error message looks like this:
```

ERROR: UndefVarError: ZygoteAD not defined
Stacktrace:
 [1] macro expansion
   @ /opt/julia-1.7.2/share/julia/stdlib/v1.7/Test/src/Test.jl:1378 [inlined]
 [2] test_autodiff_backend_error()
   @ Main ~/Navneeth_Research/Julia_Navneth_Code/testing1.jl:11
 [3] top-level scope
   @ ~/Navneeth_Research/Julia_Navneth_Code/testing1.jl:25

```

I’m running my Julia code in VS Code. Any guidance is highly appreciated.

Sorry, you probably have to use the master branch. We’re preparing a new release now.
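In case it’s useful, one way to try the in-development branch before the release is to add the package directly from its repository URL (the branch name here is an assumption; check the repository for whether it is `main` or `master`):

```julia
using Pkg

# Install the development version of ADNLPModels straight from GitHub.
Pkg.add(url = "https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl", rev = "main")
```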

I’m very new to Julia. If you don’t mind, could you kindly show me how to fix this issue?

Hi @Gummala_Navneeth ,
As @dpo said, we reworked the ADNLPModel constructor to make changing the backend smoother.

```julia
using Pkg
Pkg.update()
```

should update ADNLPModels to version 0.3.2 (you can check with Pkg.status()).

Then, you can follow the updated documentation here: Reference · ADNLPModels.jl

```julia
using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
c(x) = [x[1] + x[2]; x[2]]
nvar, ncon = 3, 2
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon))  # uses the default ForwardDiffAD backend

using ReverseDiff
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ReverseDiffAD)

using Zygote
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ZygoteAD)
```

Thanks a lot for this