Hey,
I am currently trying to use Enzyme's `gradient` function to get the gradient of a quantum mechanical expectation value with respect to some parameters that enter the Hamiltonian.
The problem: reverse-mode AD with Enzyme fails as soon as my function contains a matrix exponential.
Maybe some math helps:
My Hamiltonian H(\vec{x}) (a matrix) depends on parameters \vec{x}.
The time evolution is then given by U(\vec{x}) = e^{-i H(\vec{x}) t}.
Finally, the expectation value of an observable O_j (a constant matrix) after time evolution is given by E(\vec{x}) = \text{Tr}[ O_j U(\vec{x}) \rho U^{\dagger}(\vec{x}) ], where \rho is my density matrix (again constant).
In the end I just want the gradient of the expectation value, \nabla_{\vec{x}} E(\vec{x}), via automatic differentiation with Enzyme.
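For reference, by the product rule the derivative Enzyme has to compute is

\partial_{x_k} E(\vec{x}) = \text{Tr}[ O_j (\partial_{x_k} U) \rho U^{\dagger} + O_j U \rho (\partial_{x_k} U)^{\dagger} ],

so everything hinges on differentiating through the matrix exponential.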
I tried `Base.exp()` as well as functions from ExponentialUtilities.jl, KrylovKit.jl, and ExponentialAction.jl, all without any success (a pure-Julia workaround I am still considering is sketched further down, after the versions).
MWE for my expectation value:
```julia
using Enzyme, LinearAlgebra

function expect(x::Vector{Float64})
    ρ = 1/2 * [1 0; 0 1]                        # density matrix describing my system
    H = [x[1] 0; 0 -x[1]] + [0 x[2]; -x[2] 0]   # Hamiltonian
    U = exp(-1im * H)                           # time evolution operator, THIS ONE CAUSES ME PROBLEMS
    O = [1 0; 0 -1]                             # some observable
    ρ_evolved = U * ρ * U'                      # time-evolved density matrix
    return real(tr(O * ρ_evolved))              # expectation value
end
```
and simply calling

```julia
res = Enzyme.gradient(Reverse, x -> expect(x), rand(2))
```

produces the following warning and error:
```
┌ Warning: Using fallback BLAS replacements for (["zherk_64_", "zgemm_64_"]), performance may be degraded
└ @ Enzyme.Compiler ~/.julia/packages/GPUCompiler/GnbhK/src/utils.jl:59
ERROR: LoadError:
No augmented forward pass found for ejlstr$zgebal_64_$libblastrampoline.so.5
 at context: call void @"ejlstr$zgebal_64_$libblastrampoline.so.5"(i8* nonnull %10, i8* nonnull %13, i64 %94, i8* nonnull %16, i64 %95, i64 %96, i64 %97, i64 %98, i64 1) #244 [ "jl_roots"({} addrspace(10)* null, { i8*, {} addrspace(10)* } %93, {} addrspace(10)* null, {} addrspace(10)* null, {} addrspace(10)* null, { i8*, {} addrspace(10)* } %90, {} addrspace(10)* null, {} addrspace(10)* null) ], !dbg !298
Stacktrace:
  [1] gebal!
    @ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/lapack.jl:243
  [2] exp!
    @ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/dense.jl:682
  [3] exp
    @ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/dense.jl:622
  [4] expect
    @ ~/Documents/test.jl:5
  [5] #1
    @ ~/Documents/test.jl:11 [inlined]
  [6] diffejulia__1_7124wrap
    @ ~/Documents/test.jl:0
  [7] macro expansion
    @ ~/.julia/packages/Enzyme/RTS5U/src/compiler.jl:8398 [inlined]
  [8] enzyme_call
    @ ~/.julia/packages/Enzyme/RTS5U/src/compiler.jl:7950 [inlined]
  [9] CombinedAdjointThunk
    @ ~/.julia/packages/Enzyme/RTS5U/src/compiler.jl:7723 [inlined]
 [10] autodiff
    @ ~/.julia/packages/Enzyme/RTS5U/src/Enzyme.jl:491 [inlined]
 [11] autodiff
    @ ~/.julia/packages/Enzyme/RTS5U/src/Enzyme.jl:512 [inlined]
 [12] macro expansion
    @ ~/.julia/packages/Enzyme/RTS5U/src/Enzyme.jl:1719 [inlined]
 [13] gradient(::ReverseMode{false, false, FFIABI, false, false}, ::var"#1#2", ::Vector{Float64})
    @ Enzyme ~/.julia/packages/Enzyme/RTS5U/src/Enzyme.jl:1660
 [14] top-level scope
    @ ~/Documents/test.jl:11
```
If I read the trace correctly, `LinearAlgebra.exp!` internally calls the LAPACK balancing routine `gebal!`, and Enzyme has no reverse rule for that foreign call.

Versions:
- Julia: v1.11.1
- LinearAlgebra: v1.11.0
- Enzyme: v0.13.16
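The workaround I mentioned above: a hand-rolled matrix exponential in pure Julia, so that Enzyme only ever sees generic matrix products instead of the LAPACK-backed `exp!`. A minimal, unvalidated sketch, assuming a truncated Taylor series with scaling and squaring is accurate enough for my matrices (the order 12 is an arbitrary choice, and `expm_taylor` is just my name for it):

```julia
using LinearAlgebra

# Pure-Julia matrix exponential: scaling and squaring plus a truncated
# Taylor series, using only generic matrix +, *, / (no LAPACK calls).
function expm_taylor(A::AbstractMatrix; order::Int = 12)
    nrm = opnorm(A, 1)
    s = nrm > 1 ? ceil(Int, log2(nrm)) + 1 : 0  # number of squaring steps
    B = A / 2^s                                 # scaled so that ‖B‖₁ ≤ 1
    E = Matrix{eltype(B)}(I, size(B)...)        # accumulates the series
    term = copy(E)
    for k in 1:order
        term = term * B / k                     # k-th Taylor term B^k/k!
        E += term
    end
    for _ in 1:s
        E = E * E                               # undo the scaling by squaring
    end
    return E
end
```

In `expect` I would then replace `exp(-1im*H)` with `expm_taylor(-1im*H)`, but I have not yet verified that Enzyme differentiates this, nor how its accuracy compares to `Base.exp`.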
I already have a working implementation with Zygote. Why do I want to switch to Enzyme? To get rid of allocations: since Zygote doesn't allow mutation, I cannot use preallocated buffers and have to rely on `A * B` for matrix multiplication instead of `mul!(C, A, B)`.
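For context, this is the kind of allocation-free inner loop I am ultimately after (hypothetical buffer names, purely to illustrate the mutation Zygote rejects):

```julia
using LinearAlgebra

# Evolve ρ in place: ρ_out = U*ρ*U', reusing the preallocated buffer tmp.
function evolve!(ρ_out, tmp, U, ρ)
    mul!(tmp, U, ρ)        # tmp = U*ρ
    mul!(ρ_out, tmp, U')   # ρ_out = (U*ρ)*U'
    return ρ_out
end
```

One more idea I have not gotten to work: since Zygote differentiates `exp` through its ChainRules `rrule`, maybe I can import that rule into Enzyme. If I understand the docs correctly it would look roughly like this (untested on my side, and the exact signature is my guess):

```julia
using Enzyme, ChainRulesCore, ChainRules

# Reuse the ChainRules reverse rule for the matrix exponential in Enzyme
# (untested; I am not sure this covers complex matrices).
Enzyme.@import_rrule(typeof(Base.exp), Matrix{ComplexF64})
```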
My background: more or less only physics and maths, and I'd call myself a newbie regarding AD.
Any input is appreciated a lot.
Apologies for my previous post; I panicked because I had accidentally posted the issue without correctly formatted code.