Note: Enzyme has seen some pretty major improvements since January.

Enzyme v0.11 fixed GC support, added dynamic dispatch handling and a rule system, and introduced linear algebra support. The linear algebra currently works by falling back to differentiating the kernels themselves; this would be better handled with high-level rules that keep the calls in BLAS, but it at least works.

As an example, here’s a fairly dynamic piece of code that works fine:

```
using Enzyme

# A global, untyped (Any-eltype) matrix: this forces dynamic dispatch.
A = Any[2.0 3.0
        2.0 4.0]

function f(x::Array{Float64}, y::Array{Float64})
    y[1] = (A * x)[1]
    return nothing
end

x = [2.0, 2.0]
bx = [0.0, 0.0]
y = [0.0]
by = [1.0]

Enzyme.autodiff(Reverse, f, Duplicated(x, bx), Duplicated(y, by)); # Works fine!
```

Both forward and reverse mode work here. And now on main (unreleased), here’s an example showing it working with a globally defined Lux neural network (values compared against Zygote):

```
using Enzyme
using ComponentArrays, Lux, Random

x = [2.0, 2.0]
bx = [0.0, 0.0]
y = [0.0, 0.0]

rng = Random.default_rng()
Random.seed!(rng, 100)
dudt2 = Lux.Chain(x -> x.^3,
                  Lux.Dense(2, 50, tanh),
                  Lux.Dense(50, 2))
p, st = Lux.setup(rng, dudt2)

function f(x::Array{Float64}, y::Array{Float64})
    y .= dudt2(x, p, st)[1]
    return nothing
end

Enzyme.autodiff(Reverse, f, Duplicated(x, bx), Duplicated(y, ones(2)))

function f2(x::Array{Float64})
    dudt2(x, p, st)[1]
end

using Zygote
bx2 = Zygote.pullback(f2, x)[2](ones(2))[1]

@show bx - bx2
#=
2-element Vector{Float64}:
 -9.992007221626409e-16
 -1.7763568394002505e-15
=#
```

It’s of course not perfect yet, and since the rule system has only just landed, it needs people to start writing rules (especially for things like NNlib kernels for full Flux support). But IMO it has passed many of its major usability milestones, and now it needs the community to start helping it get the required rules.
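As a sketch of what writing such a rule looks like, here is a minimal custom forward rule via the `EnzymeRules` interface. This follows the shape shown in Enzyme.jl’s custom-rules documentation at the time of writing; the exact signature may differ across versions, and `h` is just a hypothetical function for illustration:

```
using Enzyme
import Enzyme.EnzymeRules: forward

h(x) = x^2

# Hypothetical custom forward rule for h: propagate the tangent by hand
# (here using the known derivative 2x) instead of letting Enzyme
# differentiate the body of h.
function forward(func::Const{typeof(h)}, ::Type{<:Duplicated}, x::Duplicated)
    return Duplicated(func.val(x.val), 2 * x.val * x.dval)
end
```

A rule like this is how BLAS or NNlib kernels could stay as library calls rather than being differentiated instruction by instruction.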

And the forward mode is very robust from what I can tell. I haven’t run into any issues with it, other than that it’s not clear how to do the equivalent of PreallocationTools.jl.
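For concreteness, a minimal forward-mode call looks like the following (a sketch assuming a recent Enzyme.jl; `g` is just an example function). The tangent seed `1.0` in `Duplicated` selects the direction of the directional derivative:

```
using Enzyme

g(x) = x^2 + 3x

# Seed the input tangent with 1.0 to compute dg/dx at x = 2.0;
# mathematically the derivative here is g'(2.0) = 7.0.
dg = Enzyme.autodiff(Forward, g, Duplicated(2.0, 1.0))
```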