`ForwardDiff` and `LAPACK` don't get along

ForwardDiff only works with pure Julia code AFAIK, which means it won't work with external calls to BLAS or LAPACK. When I inadvertently combine the two by using `qr()`, things break. The question is what to do about it.

Here’s an illustration of the problem:

```julia
using Optim
using LinearAlgebra

function bam(n=5)
    T = hcat(ones(n), 1:n)
    QR = qr(T)
    "not a sensible optimization target, but enough to show the problem"
    function target(β)
        sum(QR.Q * β)
    end
    r = optimize(target, zeros(n), Newton(); autodiff=:forward)
    return r
end
```


```
ERROR: LoadError: MethodError: no method matching lmul!(::LinearAlgebra.QRCompactWYQ{ForwardDiff.Dual{ForwardDiff.Tag{var"#target#5"{LinearAlgebra.QRCompactWY{Float64, Matrix{Float64}}}, Float64}, Float64, 5}, Matrix{ForwardDiff.Dual{ForwardDiff.Tag{var"#target#5"{LinearAlgebra.QRCompactWY{Float64, Matrix{Float64}}}, Float64}, Float64, 5}}}, ::Vector{ForwardDiff.Dual{ForwardDiff.Tag{var"#target#5"{LinearAlgebra.QRCompactWY{Float64, Matrix{Float64}}}, Float64}, Float64, 5}})
```

The expression `QR.Q * β` is the call that triggers the error.

As a bonus, when I try to debug this in VS Code it seems to crash the language server reliably.

The code for `qr` includes

```julia
lmul!(A::QRCompactWYQ{T,S}, B::StridedVecOrMat{T}) where {T<:BlasFloat, S<:StridedMatrix} =
```

Since it is certainly not true that `ForwardDiff.Dual <: BlasFloat`, the failure to find a matching method is unsurprising.
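A quick REPL check makes the dispatch failure concrete (a minimal sketch; it assumes ForwardDiff is installed):

```julia
using LinearAlgebra
using ForwardDiff

# BLAS/LAPACK methods are restricted to the four machine float types:
@show Float64 <: LinearAlgebra.BlasFloat   # true

# A Dual number is a plain Julia struct, so it falls outside that union,
# and the BlasFloat-only lmul! method can never be selected for it.
@show ForwardDiff.Dual{Nothing,Float64,5} <: LinearAlgebra.BlasFloat   # false
```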

One solution would be to revert to my previous, less “optimized” code, which instantiated the entire Q matrix first and thus apparently never called the external linear algebra libraries (which seems a little odd).
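That workaround can be sketched like this (a hypothetical adaptation of `bam`; the names are mine). Once Q is materialized as an ordinary `Matrix{Float64}`, multiplying it by a vector of Duals falls back to the generic pure-Julia matmul:

```julia
using LinearAlgebra

n = 5
T = hcat(ones(n), 1:n)
QR = qr(T)

# Build the full n×n Q as a plain Matrix{Float64}; multiplying the
# implicit Q by a Float64 identity matrix stays inside LAPACK.
Qfull = QR.Q * Matrix(1.0I, n, n)

# Matrix{Float64} * Vector{Dual} dispatches to the generic matmul,
# so ForwardDiff can differentiate through this target.
target(β) = sum(Qfull * β)
```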

Computing derivatives analytically is not at all appealing for the real problem.

Another solution: use `QR = qr(T, Val(true))`, which requests a pivoted factorization and avoids the call to LAPACK. Since my fake target function is unbounded, the optimization bails out, but Optim and `qr` are working together OK.
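A sketch of that variant (on recent Julia versions the `Val(true)` spelling is `ColumnNorm()`; the ForwardDiff call here is illustrative, not the author's actual target):

```julia
using LinearAlgebra
using ForwardDiff

n = 5
T = hcat(ones(n), 1:n)
QR = qr(T, ColumnNorm())   # pivoted QR; `qr(T, Val(true))` on older Julia

# QRPivoted stores its Q as a QRPackedQ, which has a generic pure-Julia
# lmul!, so Dual numbers pass through without hitting BLAS-only dispatch.
g = ForwardDiff.gradient(β -> sum(QR.Q * β), zeros(n))
```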

The documentation for `qr` also states that if the element type of the argument is not a BLAS type, it will construct a different kind of factorization object, which might also work. There's probably some way to convert the inputs to `ForwardDiff.Dual`.
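That behavior is easy to see with a non-BLAS element type that ships with Julia, e.g. `BigFloat` (used here only as a stand-in for `ForwardDiff.Dual`):

```julia
using LinearAlgebra

n = 5
T = big.(hcat(ones(n), 1:n))   # BigFloat is not a BLAS type

F = qr(T)
@show typeof(F)                # a generic LinearAlgebra.QR, not QRCompactWY

# Its Q is a QRPackedQ backed by pure-Julia code, so this multiply
# never touches LAPACK:
y = F.Q * big.(ones(n))
```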

Probably something similar explains why my previous matrix multiplication code was not using external libraries: the types involved were not BLAS types, and there is a fallback implementation in pure Julia.
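The same mechanism can be checked directly for plain matrix multiplication: with non-BLAS element types, `*` dispatches to LinearAlgebra's generic fallback rather than BLAS.

```julia
using LinearAlgebra

A = big.([1.0 2.0; 3.0 4.0])   # BigFloat elements: not a BLAS type
x = big.([1.0, 1.0])

# This dispatches to the generic matmul written in pure Julia.
y = A * x
@show y   # [3.0, 7.0] as BigFloats
```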

A pure-Julia fallback like that could be written for the part of the code that fails, and possibly contributed as a rule; see


and the tutorial section on matrix rules.