Is there a solution for automatic differentiation through code that contains a number of linear algebra operations, including, for example, `lyap` and the matrix exponential (`exp`)? This type of linear algebra occurs in probabilistic programming of a likelihood based on a Kalman-filter type of computation.

The issue with linear algebra functions is that they rely on Fortran libraries (BLAS, LAPACK). There are some options for pure-Julia linear algebra, but they are quite limited in scope. It seems that the way forward for these functions is a custom derivative. For one, this eliminates the issue of autodiff through Fortran code. Furthermore, the result could run faster and potentially also be more accurate.

One solution is to define a custom Jacobian. This would be great and would help me move forward. However, there is potentially a better solution based on the Fréchet derivative.

Some examples can illustrate these points using the matrix inverse `inv`. For a 40×40 matrix, I can compute the Jacobian in 80 milliseconds using:

```
using LinearAlgebra
using ForwardDiff
using Random

rng = MersenneTwister(0)
n = 40
A = randn(rng, n, n)
ForwardDiff.jacobian(inv, A)
# 79.573 ms (2015 allocations: 84.73 MiB)
```

Using the custom Jacobian:

```
# Jacobian of A -> inv(A): d vec(inv(A)) = -(inv(A)' ⊗ inv(A)) d vec(A)
function jacobian_inv(A)
    Ainv = inv(A)
    -kron(Ainv', Ainv)
end
```

The same Jacobian is computed in 7 milliseconds, a 10x speed-up:

```
jacobian_inv(A)
# 7.074 ms (10 allocations: 39.10 MiB)
```
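As a sanity check (a small self-contained sketch; the 5×5 size and seed are arbitrary), the custom Jacobian agrees with the one ForwardDiff computes:

```
using LinearAlgebra
using ForwardDiff
using Random

# Custom Jacobian of A -> inv(A), as defined above
function jacobian_inv(A)
    Ainv = inv(A)
    -kron(Ainv', Ainv)
end

rng = MersenneTwister(0)
A = randn(rng, 5, 5)
J_fd = ForwardDiff.jacobian(inv, A)
J_custom = jacobian_inv(A)
J_fd ≈ J_custom   # true
```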

However, the approach based on the Fréchet derivative can be more computationally efficient, as well as easier to formulate. In this approach, the derivative is not a number or a matrix but a linear map, and the chain rule is not a chain of multiplications but a composition of linear maps. The benefit is that, in practice, we may only need to evaluate this linear map in one given direction, reducing the computational load. In this sense, derivatives of vector-valued functions are quite different from scalar derivatives. As an illustration, consider the following function `ftest`:

```
using Random

rng = MersenneTwister(0)
n = 40
A = randn(rng, n, n)
B = randn(rng, n, n)
C = randn(rng, n, n)
ftest(x) = (C/(A + x[1]*B))[end,end] # = (C*inv(A + x[1]*B))[end,end]
```

The Fréchet derivative of the matrix inverse at `A` in direction `Δ` is given by:

```
# Fréchet derivative of inv at A in direction Δ: -inv(A)*Δ*inv(A)
function frechet_inv(A, Δ)
    F = lu(A)
    -(F\Δ/F)
end
```
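As a quick check (small size and seed arbitrary), the Fréchet derivative in direction `Δ` matches the full Jacobian applied to `vec(Δ)`, via the identity `vec(X*Y*Z) = kron(Z', X) * vec(Y)`:

```
using LinearAlgebra
using Random

# Fréchet derivative of inv, as defined above
function frechet_inv(A, Δ)
    F = lu(A)
    -(F\Δ/F)
end

rng = MersenneTwister(0)
n = 5
A = randn(rng, n, n)
Δ = randn(rng, n, n)
Ainv = inv(A)

# Directional derivative equals the Jacobian applied to vec(Δ)
vec(frechet_inv(A, Δ)) ≈ -kron(Ainv', Ainv) * vec(Δ)   # true
```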

We can now differentiate `ftest` using:

```
xtest = 1
(C*frechet_inv(A + xtest*B, B))[end,end]
# 68.750 μs (11 allocations: 101.52 KiB)
```

which runs in 69 μs, 100× faster than the custom Jacobian! (ForwardDiff takes 141 μs for this function.)
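To verify the chain-rule reasoning behind this (a small self-contained sketch; the 5×5 size, seed, and step size are arbitrary), the Fréchet-based derivative agrees with a central finite-difference approximation:

```
using LinearAlgebra
using Random

# Fréchet derivative of inv, as defined above
function frechet_inv(A, Δ)
    F = lu(A)
    -(F\Δ/F)
end

rng = MersenneTwister(0)
n = 5
A = randn(rng, n, n)
B = randn(rng, n, n)
C = randn(rng, n, n)
ftest(x) = (C/(A + x[1]*B))[end, end]

xtest = 1.0
# Chain rule: d/dx inv(A + x*B) = frechet_inv(A + x*B, B)
dexact = (C*frechet_inv(A + xtest*B, B))[end, end]

h = 1e-6
dfd = (ftest(xtest + h) - ftest(xtest - h)) / (2h)
isapprox(dexact, dfd; rtol = 1e-4)   # true
```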

To summarize: is there an existing package with which I can formulate a custom Jacobian or, better yet, a custom Fréchet derivative? Besides supporting autodiff for linear algebra, this would also enable autodiff more generally for black-box vector-valued functions. In the absence of this, is there a package under development where I can bring up this request? Perhaps Capstan is an option, although it is very early in its development cycle.