Enzyme.jl Critical Issues with Basic Arithmetic Operations in Automatic Differentiation

According to the Enzyme.jl documentation, "Enzyme differentiates arbitrary multivariate vector functions as the most general case in automatic differentiation". However, I encountered critical errors (including a kernel crash) using nothing more than basic arithmetic operations.

Here’s a minimal working example (MWE):

using Enzyme

foo1(x, y) = x[1] * y[1] + x[2] * y[2]
foo2(x, y) = x[1] * y[1] + x[2] * y[2] + 1.0
foo3(x, y) = x' * y
foo4(x, y) = x' * y + 1.0
foo5(x, y) = x .* y
foo6(x, y) = sum(x .* y)
foo7(x, y) = x + y
foo8(x, y) = sum(x + y)
foo9(x, y) = sum(foo7(x, y))
foo10(x, y) = [x[1] + y[1], x[2] + y[2]]
foo11(x, y) = x .+ y
foo12(x, y) = x .+ y .+ 1.0
foo13(x, y) = sum(foo12(x, y))

function test_func(func)
    x = [2.0, 3.0]
    y = [4.0, 1.0]

    grad_reverse_fail = nothing
    grad_forward_fail = nothing
    try 
        Enzyme.gradient(Reverse, func, x, y)
    catch e
        grad_reverse_fail = e
    end

    try 
        Enzyme.gradient(Forward, func, x, y)
    catch e
        grad_forward_fail = e
    end

    if grad_reverse_fail !== nothing
        println("Reverse gradient failed for $(func): ", grad_reverse_fail)
    else
        println("Reverse gradient succeeded for $(func)")
    end

    if grad_forward_fail !== nothing
        println("Forward gradient failed for $(func): ", grad_forward_fail)
    else
        println("Forward gradient succeeded for $(func)")
    end
end

test_func(foo1) # foo1 works
test_func(foo2) # foo2 works
test_func(foo3) # foo3 works
test_func(foo4) # foo4 works
test_func(foo5) # Reverse fails: Enzyme mutability error; Forward works
test_func(foo6) # foo6 works
test_func(foo7) # Reverse fails: Enzyme mutability error; Forward works
test_func(foo8) # foo8 works
test_func(foo9) # foo9 works
test_func(foo10) # Reverse fails: Enzyme mutability error; Forward works
test_func(foo11) # Reverse fails; Forward works
test_func(foo13) # Reverse fails; Forward fails: EnzymeRuntimeActivityError
test_func(foo12) # foo12 last, as it fails catastrophically and crashes the kernel

Interesting examples:
foo7 and foo9, where foo9 calls foo7 and the differentiation suddenly works again (presumably the problem is that foo7 returns a vector rather than a scalar; for context, the documentation says Enzyme should work on any f: R^n → R^m)

foo12 and foo13, where foo12 terminates the Julia kernel, while foo13 (which calls foo12) fails but does not crash the kernel.

Why are these simple examples failing? What can I realistically expect from Enzyme with these functionalities? I’m considering Enzyme for sensitivity analysis of differential equations - should I proceed with this package, or is it currently unreliable for such applications?

Thanks for your insights!


In foo5 and foo7 your function allocates and returns a new array, which Enzyme's reverse mode cannot treat as an active return. You need to use Duplicated in Enzyme to pass a shadow of x and y (and of the output). The same goes for your other functions that broadcast with `.`.
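A minimal sketch of what that looks like for a scalar-returning function (using the thread's `foo1`, `x`, and `y`; the `Active` return annotation is optional for scalar returns in recent Enzyme versions):

```julia
using Enzyme

foo1(x, y) = x[1] * y[1] + x[2] * y[2]

x = [2.0, 3.0]
y = [4.0, 1.0]
dx = zeros(2)   # shadow for x; reverse mode accumulates ∂foo1/∂x here
dy = zeros(2)   # shadow for y

Enzyme.autodiff(Reverse, foo1, Active, Duplicated(x, dx), Duplicated(y, dy))
# dx should now hold y (= [4.0, 1.0]) and dy should hold x (= [2.0, 3.0]),
# since ∂(x⋅y)/∂x = y and ∂(x⋅y)/∂y = x
```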


Enzyme.gradient assumes a scalar return (not a vector return); see Home · Enzyme.jl:

"Key convenience functions for common derivative computations are gradient (and its inplace variant gradient!). Like autodiff, the mode (forward or reverse) is determined by the first argument.

The functions gradient and gradient! compute the gradient of function with vector input and scalar return."

You should likely use Enzyme.jacobian for functions that return a vector; see Home · Enzyme.jl
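For instance, something along these lines (forward-mode Jacobian of the thread's vector-valued `foo5`; the exact shape of the returned Jacobian blocks may depend on your Enzyme version):

```julia
using Enzyme

foo5(x, y) = x .* y   # vector-valued: R^2 × R^2 → R^2

x = [2.0, 3.0]
y = [4.0, 1.0]

# Forward-mode Jacobian; one block per differentiated argument
J = Enzyme.jacobian(Forward, foo5, x, y)
```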

I have tried the suggestions you proposed (using Duplicated and jacobian) and I have also looked through the examples in the documentation again. Unfortunately, I still run into a lot of errors, which seem very unpredictable.

using Enzyme

foo1(x, y) = x[1] * y[1] + x[2] * y[2]
foo5(x, y) = x .* y
foo12(x, y) = x .+ y .+ 1.0
foo14(x, y, z) = begin
    z .= x .* y .+ 1.0
    return nothing
end

x = [2.0, 3.0]
y = [4.0, 1.0]

Enzyme.jacobian(Forward, foo1, x, y) # works
Enzyme.jacobian(Reverse, foo1, x, y) # fails, ERROR: MethodError: no method matching jacobian

dx, dy = zeros(size(x)), zeros(size(y))
Enzyme.autodiff(Forward, foo1, Duplicated(x, dx), Duplicated(y, dy)) # does not fail, but produces wrong result
println(dx, dy)

dx, dy = zeros(size(x)), zeros(size(y))
Enzyme.autodiff(Reverse, foo1, Duplicated(x, dx), Duplicated(y, dy)) # works
println(dx, dy)

Enzyme.jacobian(Forward, foo5, x, y) # works, but with warning, Warning: TODO forward zero-set of memorycopy used memset rather than runtime type
Enzyme.jacobian(Reverse, foo5, x, y) # fails, ERROR: MethodError: no method matching jacobian

dx, dy = zeros(size(x)), zeros(size(y))
Enzyme.autodiff(Forward, foo5, Duplicated(x, dx), Duplicated(y, dy)) # does not fail, but produces wrong result
println(dx, dy)

dx, dy = zeros(size(x)), zeros(size(y))
Enzyme.autodiff(Reverse, foo5, Duplicated(x, dx), Duplicated(y, dy)) # fails, ERROR: Duplicated Returns not yet handled
println(dx, dy)

z = zeros(size(x))
dx, dy, dz = zeros(size(x)), zeros(size(y)), ones(size(z))
Enzyme.autodiff(Forward, foo14, Duplicated(x, dx), Duplicated(y, dy), Duplicated(z, dz)) # does not fail, but produces wrong result
println(dx, dy, dz)

z = zeros(size(x))
dx, dy, dz = zeros(size(x)), zeros(size(y)), ones(size(z))
Enzyme.autodiff(Reverse, foo14, Duplicated(x, dx), Duplicated(y, dy), Duplicated(z, dz)) # works
println(dx, dy, dz)

Enzyme.jacobian(Reverse, foo12, x, y) # fails, ERROR: MethodError: no method matching jacobian
Enzyme.jacobian(Forward, foo12, x, y) # crashes the kernel
  1. What is the correct way to compute derivatives for such functions in both modes?
  2. Why do Forward and Reverse produce different results even when they are called in the same way?

Most of these errors are due to the way you’re (mis)using the Enzyme.jl API.

Reverse-mode jacobian does not support several arguments.

Forward-mode autodiff requires you to specify input perturbations that are propagated to the output. Here the input perturbations are all zero, so no derivatives are propagated.
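Concretely, a nonzero seed in an input shadow picks a direction along which the derivative is propagated (a sketch with the thread's `foo1`; the structure of the returned value may vary by Enzyme version, so it is not asserted here):

```julia
using Enzyme

foo1(x, y) = x[1] * y[1] + x[2] * y[2]

x = [2.0, 3.0]
y = [4.0, 1.0]
dx = [1.0, 0.0]   # seed: perturb x along the first coordinate
dy = [0.0, 0.0]   # hold y fixed

# Forward mode propagates the directional derivative along (dx, dy):
# d/dε foo1(x + ε*dx, y + ε*dy) at ε = 0, which is y[1] = 4.0 here
Enzyme.autodiff(Forward, foo1, Duplicated(x, dx), Duplicated(y, dy))
```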

See Billy's remark above: for vector returns you need to reformulate your function f(x) = y as f!(y, x) = nothing.
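A sketch of that reformulation, combined with seeding the output shadow to extract one row of the Jacobian at a time (this mirrors the thread's foo12 in mutating form; the output-first argument order and the name `foo12!` are just conventions here):

```julia
using Enzyme

# vector-returning f(x, y) = x .+ y .+ 1.0 rewritten to mutate a preallocated output
foo12!(z, x, y) = (z .= x .+ y .+ 1.0; nothing)

x = [2.0, 3.0]; y = [4.0, 1.0]
z = zeros(2)
dx = zeros(2); dy = zeros(2)
dz = [1.0, 0.0]   # unit seed on the output shadow: reverse mode then
                  # accumulates row 1 of the Jacobian into dx and dy

Enzyme.autodiff(Reverse, foo12!, Duplicated(z, dz),
                Duplicated(x, dx), Duplicated(y, dy))
# dx should now hold ∂z[1]/∂x = [1.0, 0.0]; reseed dz = [0.0, 1.0] for row 2
```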


If you are struggling with the Enzyme.jl API, you may find it useful to try DifferentiationInterface.jl as a starting point. It is much less powerful and only supports a single active argument, but it hides the complexity of handling activity annotations (Duplicated and friends) from the user. In particular, DI.pushforward and DI.pullback are useful to compute JVPs and VJPs respectively.
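A sketch of what that looks like (function and backend names follow recent DifferentiationInterface releases and are worth double-checking against the docs for your installed version):

```julia
using DifferentiationInterface
import Enzyme   # loading Enzyme activates the AutoEnzyme backend

f(x) = sum(x .* x)          # single active argument, scalar return
x = [2.0, 3.0]
backend = AutoEnzyme()

g = gradient(f, backend, x)                      # plain gradient
# pushforward computes a JVP along a tuple of tangents;
# pullback computes a VJP against a tuple of cotangents
jvp = pushforward(f, backend, x, ([1.0, 0.0],))
```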


To get your mind around the way forward and reverse modes work in Enzyme, this discussion might help: