`ForwardDiff` only works with pure Julia code AFAIK, which means it won't work with external calls to BLAS or LAPACK. When I inadvertently combine them by using `qr()` it doesn't work. The question is what to do about it.
Here’s an illustration of the problem:
```julia
using Optim
using LinearAlgebra

function bam(n=5)
    T = hcat(ones(n), 1:n)
    QR = qr(T)
    # not a sensible optimization target, but enough to show the problem
    function target(β)
        sum(QR.Q * β)
    end
    r = optimize(target, zeros(n), Newton(); autodiff=:forward)
end

bam()
```

Running it fails with:

```
ERROR: LoadError: MethodError: no method matching lmul!(::LinearAlgebra.QRCompactWYQ{ForwardDiff.Dual{ForwardDiff.Tag{var"#target#5"{LinearAlgebra.QRCompactWY{Float64, Matrix{Float64}}}, Float64}, Float64, 5}, Matrix{ForwardDiff.Dual{ForwardDiff.Tag{var"#target#5"{LinearAlgebra.QRCompactWY{Float64, Matrix{Float64}}}, Float64}, Float64, 5}}}, ::Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#target#5"{LinearAlgebra.QRCompactWY{Float64, Matrix{Float64}}}, Float64}, Float64, 5}})
```
The expression `QR.Q * β` is the call that triggers the error.
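The failure can be reproduced without Optim by calling ForwardDiff directly (Optim's `autodiff=:forward` uses it under the hood). This is a minimal sketch of what happens inside the optimizer, on Julia versions where `Q * v` dispatches to the BLAS-only method:

```julia
using ForwardDiff, LinearAlgebra

n = 5
QR = qr(hcat(ones(n), 1:n))

# ForwardDiff.gradient evaluates the closure on a Vector{Dual},
# so QR.Q * β ends up looking for an lmul! method with Dual
# element type, which only exists for T<:BlasFloat.
ForwardDiff.gradient(β -> sum(QR.Q * β), zeros(n))  # MethodError on affected versions
```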
As a bonus, when I try to debug this in VS Code it seems to crash the language server reliably.
The code for `qr` includes

```julia
lmul!(A::QRCompactWYQ{T,S}, B::StridedVecOrMat{T}) where {T<:BlasFloat, S<:StridedMatrix} =
    LAPACK.gemqrt!('L','N',A.factors,A.T,B)
```
Since it is certainly not true that `ForwardDiff.Dual <: BlasFloat`, the failure to find a matching method is unsurprising.
One solution would be to revert to my previous, less "optimized" code, which instantiated the entire `Q` matrix first and thus apparently never called the external linear algebra libraries (which seems a little odd).
Computing derivatives analytically is not at all appealing for the real problem.
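For reference, here is a minimal sketch of that "instantiate `Q` first" fallback, calling ForwardDiff directly rather than through Optim. Note that `Matrix(QR.Q)` would give only the thin `n×2` factor, so the full `n×n` `Q` is materialized by multiplying against the identity (an assumption about the desired shape, to match the `zeros(n)` starting point above):

```julia
using ForwardDiff, LinearAlgebra

n = 5
T = hcat(ones(n), 1:n)
QR = qr(T)

# Materialize the full n×n Q as an ordinary Matrix{Float64};
# this multiply happens at Float64, so BLAS is fine here.
Q = QR.Q * Matrix(1.0I, n, n)

target(β) = sum(Q * β)

# With a plain Matrix, Q * β falls back to generic matrix-vector
# multiplication, which accepts Dual numbers, so this succeeds.
g = ForwardDiff.gradient(target, zeros(n))
```

The gradient of `sum(Q * β)` is just the vector of column sums of `Q`, which gives an easy sanity check on the result.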