Thanks for the link, it is very useful, and I'm starting to implement it in my full code.
While doing so I ran into some issues, so I went back to analyzing the simple code here. I noticed that the following code (a revision of the first example I wrote) doesn't throw an error, but returns nothing for the gradient.
using LinearAlgebra
using OrdinaryDiffEq
using SciMLOperators
using Zygote
using SciMLSensitivity
const T = ComplexF64
const N = 10
const u0 = ones(T, N)
H_tmp = rand(T, N, N)
const H = H_tmp + H_tmp'  # random Hermitian matrix
function my_f(γ)
    U = MatrixOperator(-1im * H - γ * Diagonal(H))
    tspan = (0.0, 1.0)
    prob = ODEProblem{true}(U, u0, tspan)
    sol = solve(prob, Tsit5())
    return real(sol.u[end][end])
end
my_f(1.9) # -0.041501631765322705
gradient(my_f, 1.9)
┌ Warning: Potential performance improvement omitted. EnzymeVJP tried and failed in the automated AD choice algorithm. To show the stack trace, set SciMLSensitivity.STACKTRACE_WITH_VJPWARN[] = true. To turn off this printing, add `verbose = false` to the `solve` call.
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/XCu1T/src/concrete_solve.jl:24
┌ Warning: Potential performance improvement omitted. ReverseDiffVJP tried and failed in the automated AD choice algorithm. To show the stack trace, set SciMLSensitivity.STACKTRACE_WITH_VJPWARN[] = true. To turn off this printing, add `verbose = false` to the `solve` call.
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/XCu1T/src/concrete_solve.jl:67
┌ Warning: Reverse-Mode AD VJP choices all failed. Falling back to numerical VJPs
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/XCu1T/src/concrete_solve.jl:207
(nothing,)
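If I understand the adjoint machinery correctly, part of the issue may be that γ is baked into the MatrixOperator when my_f constructs it, so the ODEProblem itself carries no parameters for the sensitivity algorithm to differentiate. A minimal sketch of what I mean by passing γ through p instead (rhs_p and my_f_p are just illustrative names; I have not verified that this avoids the complex-number problems shown below):

# γ enters the solver through p, so the adjoint can see it as a parameter
rhs_p(u, p, t) = (-1im * H - p[1] * Diagonal(H)) * u

function my_f_p(γ)
    prob = ODEProblem(rhs_p, u0, (0.0, 1.0), [γ])
    sol = solve(prob, Tsit5())
    return real(sol.u[end][end])
end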
First of all, it is strange to see warnings related to Enzyme when I'm using Zygote. And then, the gradient is just nothing. I thought the problem could be related to the fact that the ODE is parameter-free (as sketched above, γ never reaches the solver through p), so I decided to implement it in a different way:
coef(a, u, p, t) = -p[1]  # update function for the ScalarOperator: its value becomes -γ
function my_f(γ)
    # U = MatrixOperator(-1im * H - γ * Diagonal(H))
    U = MatrixOperator(-1im * H) + ScalarOperator(one(T), coef) * MatrixOperator(Diagonal(H))
    tspan = (0.0, 1.0)
    prob = ODEProblem{true}(U, u0, tspan, [γ])
    sol = solve(prob, Tsit5())
    return real(sol.u[end][end])
end
my_f(1.9) # -0.04150163176532281
gradient(my_f, 1.9)
┌ Warning: Using fallback BLAS replacements for (["zgemv_64_"]), performance may be degraded
└ @ Enzyme.Compiler ~/.julia/packages/GPUCompiler/GnbhK/src/utils.jl:59
┌ Warning: Potential performance improvement omitted. ReverseDiffVJP tried and failed in the automated AD choice algorithm. To show the stack trace, set SciMLSensitivity.STACKTRACE_WITH_VJPWARN[] = true. To turn off this printing, add `verbose = false` to the `solve` call.
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/XCu1T/src/concrete_solve.jl:67
┌ Warning: Reverse-Mode AD VJP choices all failed. Falling back to numerical VJPs
└ @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/XCu1T/src/concrete_solve.jl:207
ERROR: InexactError: Float64(0.010428536019754344 + 0.048873021588869255im)
Stacktrace:
[1] Real
@ ./complex.jl:44 [inlined]
[2] convert
@ ./number.jl:7 [inlined]
[3] setindex!
@ ./array.jl:976 [inlined]
[4] setindex!
@ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/adjtrans.jl:335 [inlined]
[5] _setindex!
@ ./abstractarray.jl:1436 [inlined]
[6] setindex!
@ ./abstractarray.jl:1413 [inlined]
[7] _modify!
@ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/generic.jl:91 [inlined]
[8] _generic_matmatmul!(C::Adjoint{…}, A::Adjoint{…}, B::Matrix{…}, _add::LinearAlgebra.MulAddMul{…})
@ LinearAlgebra ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/matmul.jl:923
[9] generic_matmatmul!
@ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/matmul.jl:868 [inlined]
[10] _mul!
@ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/matmul.jl:287 [inlined]
[11] mul!
@ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/matmul.jl:285 [inlined]
[12] mul!
@ ~/.julia/juliaup/julia-1.11.1+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/matmul.jl:253 [inlined]
[13] vec_pjac!(out::Vector{…}, λ::Vector{…}, y::Vector{…}, t::Float64, S::SciMLSensitivity.GaussIntegrand{…})
@ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/XCu1T/src/gauss_adjoint.jl:469
This time it returns an error, together with the previous warnings: apparently a complex value is being written into a real-valued array inside the Gauss adjoint.
I would like to point out that it is usually the first case that I need, which is much simpler. Of course, the second one would also be fine.
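In case it helps the discussion, here are two workarounds I am considering. Both are sketches, not verified fixes.

First, forcing a specific sensitivity algorithm instead of the automatic choice, since the fallback to numerical VJPs and GaussAdjoint seems to be where things go wrong (I don't know whether this combination handles the complex state any better):

# explicit sensealg choice instead of the automatic selection
sol = solve(prob, Tsit5();
    sensealg = InterpolatingAdjoint(autojacvec = ZygoteVJP()))

Second, rewriting the first (simpler) case as a real-valued system of size 2N, since the InexactError suggests the adjoint internals expect real arrays. With A = -1im * H - γ * Diagonal(H) = Ar + 1im * Ai and u = ur + 1im * ui, du/dt = A * u splits into d(ur)/dt = Ar * ur - Ai * ui and d(ui)/dt = Ai * ur + Ar * ui:

# real-valued reformulation: the state is [real(u); imag(u)]
function rhs_real(v, p, t)
    A = -1im * H - p[1] * Diagonal(H)
    Ar, Ai = real(A), imag(A)
    ur, ui = v[1:N], v[(N + 1):(2N)]
    return vcat(Ar * ur - Ai * ui, Ai * ur + Ar * ui)
end

function my_f_real(γ)
    v0 = vcat(real(u0), imag(u0))
    prob = ODEProblem(rhs_real, v0, (0.0, 1.0), [γ])
    sol = solve(prob, Tsit5())
    return sol.u[end][N]  # real part of the last component of u
end

Does one of these directions look right, or is there a better way to handle the complex-valued first case?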