Enzyme with Const() on a vector throws an error

Hello all,

I would like to autodiff a function where some vector-valued arguments are kept constant. I thought I would use Enzyme’s Const() functionality, but I run into issues when applying Const() to vectors.

The following is a minimal working example that reproduces the problem. I use Enzyme v0.13.12 with Julia 1.11.

using Enzyme

# this will work
function f(x::Array{Float64}, c::Vector{Float64})
    y = (x[1]-c[1]) * (x[1]-c[1]) + (x[2]-c[2]) * (x[2]-c[2])
    return y
end;

# this won't work
function h(x::Array{Float64}, c::Vector{Float64})
    y = sum( (x-c).^2 )
    return y
end;

# this will work
function h2(x::Array{Float64}, c1::Float64, c2::Float64)
    c = [c1, c2]
    y = sum( (x-c).^2 )
    return y
end;

The three functions compute the squared norm of the difference between the two vectors x and c. For example:

x = [4.0, 3.0];
c = [2.0, 1.0];

f(x, c)
f(x, c) == h(x, c) == h2(x, c[1], c[2]) # returns true

Now, autodiff on f and h2 works:

dx = [0.0, 0.0]
autodiff(Reverse, f, Active, Duplicated(x, dx), Const(c));
dx
2*(x-c) == dx # true

dx = [0.0, 0.0]
autodiff(Reverse, h2, Active, Duplicated(x, dx), Const(c[1]), Const(c[2]));
dx
2*(x-c) == dx # true

However, for h I get a "Constant memory is stored (or returned) to a differentiable variable." error:

dx = [0.0, 0.0]
autodiff(Reverse, h, Active, Duplicated(x, dx), Const(c));
dx

I am new to Julia and its AD system, and I am puzzled by this error. It seems that Const(c) does not work when c is a vector? What would I need to change to make it work? Manually expanding the vector c into scalars won’t be an option for me.

Many thanks for your help.

I can’t reproduce this on Julia 1.10.6 with Enzyme v0.13.12. Maybe you’re using an older version of Enzyme?

Enzyme on Julia 1.11 still has some issues; use it on Julia 1.10 for now.

This does indeed seem to be a Julia version issue. The posted code works with Julia 1.10.6 and Enzyme v0.13.12, but not with Julia 1.11 (again with Enzyme v0.13.12).

The issue seems to be with the Julia version (v1.11.1) and not the Enzyme version; I was using the same one as you. Things indeed work with Julia v1.10.6. Many thanks for the response.

Yeah, Enzyme still has a few things to work out for v1.11.

After debugging with @gbaraldi, @jameson, and @Oscar_Smith, we have found a similar issue to be caused by a performance regression in 1.11.

I believe a fix is in the works (Allow taking Matrix slices without an extra allocation by vtjnash · Pull Request #56236 · JuliaLang/julia · GitHub) and will subsequently be released in a patch of Julia 1.11.

In particular, 1.11’s new array implementation seems to have caused issues for alias analysis: julia/base/abstractarray.jl at 9af0dea9d2c7c957bec7a14acacf0b234447be31 · JuliaLang/julia · GitHub makes it harder to prove that a fresh allocation doesn’t alias with other data.

It’s likely the root cause of your issue above is something similar, so hopefully that fix gets pushed into 1.11 soon.

A workaround has been added in the latest release, which enables your case above (and simple broadcasting) to work on 1.11 without runtime activity.

Awesome, thank you for letting me know!

Sorry to revive the old thread, but I came here via a Google search after hitting the same problem (I’m also new to AD in Julia). I’m on Julia v1.11.3 with the following package versions:

(julia) pkg> st
  [a0c0ee7d] DifferentiationInterface v0.6.50
  [7da242da] Enzyme v0.13.35
⌅ [f6369f11] ForwardDiff v0.10.38

Using Enzyme with Const wrapping a DataFrame gives the same error. Is this expected, or is it fixed and I am somehow behind on versions (though I don’t appear to be)?

using RDatasets, DataFrames
using DifferentiationInterface
using Enzyme
using ForwardDiff

trees = dataset("datasets", "trees")

tree_mod_expectation = function(beta, data)
    return @. beta[1] * (data[!, :Girth]^beta[2]) * (data[!, :Height]^beta[3])
end

beta = [.002, 2, 1]

f = tree_mod_expectation

DifferentiationInterface.jacobian(f, AutoEnzyme(), beta, Constant(trees))
DifferentiationInterface.jacobian(f, AutoForwardDiff(), beta, Constant(trees))

The error:

julia> DifferentiationInterface.jacobian(f, AutoEnzyme(), beta, Constant(trees))
ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Runtime-Activity).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set the Enzyme mode to turn on runtime activity (e.g. autodiff(set_runtime_activity(Reverse), ...) ). This will maintain correctness, but may slightly reduce performance.
Mismatched activity for:   ret {} addrspace(10)* %126, !dbg !285 const val:   %126 = load atomic {} addrspace(10)*, {} addrspace(10)* addrspace(13)* %125 unordered, align 8, !dbg !296, !tbaa !272, !alias.scope !108, !noalias !111, !enzyme_type !271
Type tree: {}
 llvalue=  %125 = getelementptr inbounds {} addrspace(10)*, {} addrspace(10)* addrspace(13)* %124, i64 %109, !dbg !296

Stacktrace:
 [1] getindex
   @ ~/.julia/packages/DataFrames/kcA9R/src/dataframe/dataframe.jl:558

Stacktrace:
  [1] lookupname
    @ ~/.julia/packages/DataFrames/kcA9R/src/other/index.jl:0 [inlined]
  [2] getindex
    @ ~/.julia/packages/DataFrames/kcA9R/src/other/index.jl:440 [inlined]
  [3] getindex
    @ ~/.julia/packages/DataFrames/kcA9R/src/dataframe/dataframe.jl:557
  [4] #11
    @ ~/Desktop/git/FunWithSplines/julia/mgcv.jl:9 [inlined]
  [5] fwddiffe3julia__11_29842wrap
    @ ~/Desktop/git/FunWithSplines/julia/mgcv.jl:0
  [6] macro expansion
    @ ~/.julia/packages/Enzyme/g1jMR/src/compiler.jl:5445 [inlined]
  [7] enzyme_call
    @ ~/.julia/packages/Enzyme/g1jMR/src/compiler.jl:4983 [inlined]
  [8] ForwardModeThunk
    @ ~/.julia/packages/Enzyme/g1jMR/src/compiler.jl:4871 [inlined]
  [9] autodiff
    @ ~/.julia/packages/Enzyme/g1jMR/src/Enzyme.jl:654 [inlined]
 [10] autodiff
    @ ~/.julia/packages/Enzyme/g1jMR/src/Enzyme.jl:524 [inlined]
 [11] macro expansion
    @ ~/.julia/packages/Enzyme/g1jMR/src/sugar.jl:726 [inlined]
 [12] #gradient#126
    @ ~/.julia/packages/Enzyme/g1jMR/src/sugar.jl:582 [inlined]
 [13] #jacobian#128
    @ ~/.julia/packages/Enzyme/g1jMR/src/sugar.jl:789 [inlined]
 [14] jacobian
    @ ~/.julia/packages/Enzyme/g1jMR/src/sugar.jl:788 [inlined]
 [15] jacobian(f::var"#11#12", prep::DifferentiationInterfaceEnzymeExt.EnzymeForwardOneArgJacobianPrep{…}, backend::AutoEnzyme{…}, x::Vector{…}, contexts::Constant{…})
    @ DifferentiationInterfaceEnzymeExt ~/.julia/packages/DifferentiationInterface/7eD1K/ext/DifferentiationInterfaceEnzymeExt/forward_onearg.jl:231
 [16] jacobian(f::var"#11#12", backend::AutoEnzyme{Nothing, Nothing}, x::Vector{Float64}, contexts::Constant{DataFrame})
    @ DifferentiationInterface ~/.julia/packages/DifferentiationInterface/7eD1K/src/fallbacks/no_prep.jl:51
 [17] top-level scope
    @ ~/Desktop/git/FunWithSplines/julia/mgcv.jl:16

Hi! Can you please try it with Enzyme’s native jacobian API for an MWE?

No problem, this reproduces the error:

using RDatasets, DataFrames
using Enzyme

trees = dataset("datasets", "trees")

tree_mod_expectation = function(beta, data)
    return @. beta[1] * (data[!, :Girth]^beta[2]) * (data[!, :Height]^beta[3])
end

beta = [.002, 2, 1]

Enzyme.jacobian(Forward, x -> tree_mod_expectation(x, trees), beta)

Per the error message saying to mark the function as constant, what if you use:

Enzyme.jacobian(Forward, Const(x -> tree_mod_expectation(x, trees)), beta)

Alternatively (and preferably),

Enzyme.jacobian(Forward, Const(tree_mod_expectation), beta, Const(trees))

And if you’re already using DifferentiationInterface.jl in your code, you can force this function annotation by using:

backend = AutoEnzyme(mode=Enzyme.Forward, function_annotation=Enzyme.Const)
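
For instance, the call from your earlier post should then go through unchanged (a sketch reusing tree_mod_expectation, beta, and trees from above):

DifferentiationInterface.jacobian(tree_mod_expectation, backend, beta, Constant(trees))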

Thanks both @wsmoses and @gdalle.

My only understanding of what this constant marking does comes from the docs here: Advanced tutorial · DifferentiationInterface.jl. I did not know you could also mark an entire function with Const. Can you point me to the docs where I can read more about what Const(tree_mod_expectation) means when applied to a function? Looking through the Enzyme docs, I can only find API reference · Enzyme.jl, which explains what happens when function arguments are marked as Const.

As you noticed, DifferentiationInterface.jl has an annotation system which is a bit similar to that of Enzyme.jl, but with different conventions in order to apply more or less equally to all autodiff backends. The first argument passed to the function f is always differentiated, and then you can annotate the rest with either DI.Constant (which translates to Enzyme.Const) or DI.Cache (which roughly translates to Enzyme.Duplicated).

The trouble is, Enzyme.jl also allows you to annotate the function itself, when it contains some data which may play a role in differentiation. This is such an advanced and unique aspect that I didn’t make room for it in DI’s native API. Instead, I chose to put that setting in the backend object AutoEnzyme (see its docstring). So whenever you use AutoEnzyme with function_annotation=Enzyme.Const, you’re forcing DI to annotate f as Enzyme.Const(f) before calling Enzyme.autodiff / Enzyme.jacobian / Enzyme.gradient. Does that clarify things?
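
To make the two annotations concrete, here is a minimal sketch (the sumsq! function and its buffer are made up for illustration, not from this thread):

using DifferentiationInterface
import Enzyme

# `buf` is writable scratch space whose initial contents don't matter
function sumsq!(x, buf)
    buf .= x .^ 2
    return sum(buf)
end

x = [1.0, 2.0, 3.0]
buf = zeros(3)

# `x` is differentiated; `buf` is annotated as scratch storage (roughly
# Enzyme.Duplicated), while a constant argument would be wrapped in Constant(...)
gradient(sumsq!, AutoEnzyme(), x, Cache(buf))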

In this sense, Enzyme makes no distinction between the function and any other argument: either can contain data to be marked differentiable or not.

You can mark any argument (or the function itself) as Enzyme.Const to ask Enzyme not to differentiate with respect to it. See, for example: API reference · Enzyme.jl
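
For instance, with the f, x, dx, and c from the original post, the function can be annotated like any other argument (a sketch):

autodiff(Reverse, Const(f), Active, Duplicated(x, dx), Const(c))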

This does carry with it the semantics that if you store differentiable data into a variable marked Const, and later load from it, you may get a counter-intuitive result (see FAQ · Enzyme.jl).
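
As a made-up sketch of that pitfall (the function g and its buffer tmp are hypothetical, not from this thread):

# active data derived from `x` is stored into the Const-annotated buffer `tmp`
# and later loaded back, so Enzyme cannot track the gradient path through it
function g(x, tmp)
    tmp[1] = x[1] * 2
    return tmp[1] * x[2]
end

x  = [2.0, 3.0]
dx = zeros(2)
# without runtime activity, this may throw the mismatched-activity error quoted below
autodiff(Reverse, g, Active, Duplicated(x, dx), Const(zeros(2)))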

Though also looking more closely at the error message, it said:

ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Runtime-Activity).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set the Enzyme mode to turn on runtime activity (e.g. autodiff(set_runtime_activity(Reverse), ...) ). This will maintain correctness, but may slightly reduce performance.

which indicates the issue is not the function argument (though it’s possible that annotating it would also resolve the problem).
Here Enzyme is suggesting you try enabling runtime activity on the mode, e.g. Enzyme.autodiff(set_runtime_activity(Reverse), ...).

In our case that would be

Enzyme.jacobian(set_runtime_activity(Forward), Const(tree_mod_expectation), beta, Const(trees))

All Enzyme native functions (autodiff, gradient, jacobian) take a mode, which can itself take parameters, like runtime activity here.
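
For example, the same parameterized mode could be reused with the h from the very first post (a sketch, assuming the x and c defined there):

Enzyme.gradient(set_runtime_activity(Reverse), h, x, Const(c))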

I’d generally recommend using the Enzyme native API, as it is the only one officially supported (and the error messages actually correspond to that API xD).

Indeed it’s very clear, thanks!

If you want to reproduce this behavior in DI, again it goes through the backend because it is very specific to Enzyme:

backend = AutoEnzyme(mode=Enzyme.set_runtime_activity(Enzyme.Forward))
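
after which, as a sketch, the same call as before should work:

DifferentiationInterface.jacobian(tree_mod_expectation, backend, beta, Constant(trees))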

Perhaps I could add error hints to DI for these cases, pointing people to the runtime activity and function annotation options inside AutoEnzyme? That way they could figure it out on their own, or know where to look in the docs?

Error hints are additional error messages that are displayed on top of a thrown exception, like this one for when a backend isn’t loaded and triggers a MethodError:

julia> using DifferentiationInterface

julia> gradient(sum, AutoEnzyme(), [1.0])
ERROR: MethodError: no method matching _prepare_pullback_aux(::Val{…}, ::DifferentiationInterface.PullbackFast, ::typeof(sum), ::AutoEnzyme{…}, ::Vector{…}, ::Tuple{…})
The function `_prepare_pullback_aux` exists, but no method is defined for this combination of argument types.

The autodiff backend you chose requires a package which may not be loaded. Please run the following command and try again:

        import Enzyme

Closest candidates are:
  _prepare_pullback_aux(::Val, ::DifferentiationInterface.PullbackSlow, ::F, ::ADTypes.AbstractADType, ::Any, ::NTuple{N, T} where {N, T}, Context...) where {F, C}
   @ DifferentiationInterface ~/Documents/GitHub/Julia/DifferentiationInterface.jl/DifferentiationInterface/src/first_order/pullback.jl:157
  _prepare_pullback_aux(::Val, ::DifferentiationInterface.PullbackSlow, ::F, ::Any, ::ADTypes.AbstractADType, ::Any, ::NTuple{N, T} where {N, T}, ::Context...) where {F, C}
   @ DifferentiationInterface ~/Documents/GitHub/Julia/DifferentiationInterface.jl/DifferentiationInterface/src/first_order/pullback.jl:174

It’s not the same as a custom error message because I can’t control the creation of the MethodError here, so I just add something a posteriori after it has been thrown (see this file for the definition of the hint).

If there is a well-defined and public type in Enzyme for this specific error (as in, throw(EnzymeConstantDifferentiableError("blablabla")) instead of error("blablabla")), I can catch it and add a short text telling users what to try?
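
For reference, a hypothetical sketch of what such a hint could look like, using Base.Experimental.register_error_hint (EnzymeConstantDifferentiableError is the made-up type name from above, and displaying the hint would also require the exception’s showerror to call show_error_hints):

# hypothetical: assumes Enzyme defined and exported such an exception type
Base.Experimental.register_error_hint(Enzyme.EnzymeConstantDifferentiableError) do io, exc
    print(io, "\nWith DifferentiationInterface, consider ",
              "AutoEnzyme(mode=Enzyme.set_runtime_activity(mode)) or ",
              "AutoEnzyme(function_annotation=Enzyme.Const).")
end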