Hello, I’m trying to use the ForwardDiff.jl package for automatic differentiation of a function that includes a matrix solve via the backslash operator `\`. ForwardDiff requires arrays whose element type is the more general `Real` (instead of `Float64`); however, the `\` solver always returns an `Array{Float64}`.
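For context, here is a small check (my understanding of why the `Real` element type is needed) showing that ForwardDiff's dual numbers are subtypes of `Real` but not of `Float64`, so a `Float64`-typed container cannot hold them:

```julia
using ForwardDiff

# A dual number carrying one partial derivative
d = ForwardDiff.Dual(1.0, 1.0)

println(d isa Real)     # duals fit in Real-typed containers
println(d isa Float64)  # but not in Float64-typed ones
```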

Example code:

```julia
using ForwardDiff

# input v
v = [1. 2. 3.]
# true X (would be the target in an optimization)
Xtrue = [1.1, 1.9, 0.8]

function f(v)
    # in A*X = B, solve for X
    # A is a 3x3 square matrix whose values depend on the input v
    A = Matrix{Real}([1. 2. v[1]; 6. v[2] 4.; 8. 7. v[3]])
    # B is a length-3 vector
    B = Vector{Real}([5., 13., 23.])
    # solve for X using the backslash operator
    X = A \ B  # length-3 vector
    # the cost is the squared difference between the calculated X and the true X
    cost = sum((X - Xtrue).^2)
    return cost
end

ForwardDiff.gradient(f, v)
```

`ForwardDiff.gradient` gives this error:

`MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{typeof(f), Float64}, Float64, 3})`

I think it’s because `A \ B` assigns an `Array{Float64}` to `X`.
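To support this guess, here is a minimal sketch (my assumption about the mechanism, not a confirmed diagnosis) of how this exact `MethodError` can arise: writing a `Dual` into a `Float64`-typed array forces `convert(Float64, ::Dual)`, and no such method exists.

```julia
using ForwardDiff

d = ForwardDiff.Dual(1.0, 1.0)  # a dual number carrying one partial
x = zeros(Float64, 1)           # a Float64-typed container

try
    x[1] = d                    # forces convert(Float64, ::Dual)
catch err
    println(typeof(err))        # the same kind of MethodError as above
end
```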

Is there a way to make this work?