I want to solve a fairly large but sparse DAE system using DifferentialEquations.jl.

I can do this with decent efficiency when using `Float64`s.

Now I would like to do the same thing with higher-precision numbers (`Double64`, `BigFloat`… you name it) to cover some edge cases.

The problem is that the Jacobian is a `SparseMatrixCSC`. At that point the `linsolve`s no longer work, since `lu` is not specialized for `SparseMatrixCSC{BigFloat}`.
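A minimal reproduction of the failure looks roughly like this (the matrix here is just a random stand-in for my Jacobian):

```julia
using SparseArrays, LinearAlgebra

# Widen a random sparse matrix to BigFloat; broadcasting `big.` preserves sparsity
A = big.(sprand(100, 100, 0.05)) + I   # SparseMatrixCSC{BigFloat, Int64}

lu(A)  # errors: sparse `lu` dispatches to UMFPACK, which only supports
       # Float64/ComplexF64 element types
```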

Not sure if BlockBandedMatrices.jl is an option with such a sparsity pattern:

```
julia> c.Jac
2524×2524 SparseArrays.SparseMatrixCSC{Float64, Int64} with 21124 stored entries:
⠻⣦⡀⠀⠈⠳⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠈⠻⣦⡀⠀⠈⠳⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⢦⡀⠀⠈⠻⣦⡀⠀⠈⠳⣄⠀⠀⠀⢦⡀⠀⠀⠀⠀⠂⠂⠀⠀
⠀⠙⢦⡀⠀⠈⠻⣦⡀⠀⠈⠳⣄⠀⠀⠙⢦⡀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠙⢦⡀⠀⠈⠻⣦⡀⠀⠈⠳⣄⠀⠀⠙⢦⠀⡀⡀⢀⠀
⠀⠀⠀⠀⠀⠙⢦⡀⠀⠈⠻⣦⡀⠀⠈⠳⣄⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠙⢦⡀⠀⠈⠻⣦⡀⠀⠈⠳⣄⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠈⠳⣄⠀⠀⠙⢦⡀⠀⠈⠻⣦⡀⠀⠈⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠈⠳⣄⠀⠀⠙⢦⡀⠀⠈⠻⣦⡀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⠀⠀⠀⠙⠂⠀⠀⠈⠻⣦⣦⡂⠐⠀
⠀⠀⠀⠀⠨⠀⠀⠀⠀⠨⠀⠀⠀⠀⠀⠀⠀⠀⠨⠻⠿⣧⣠⡀
⠀⠀⠀⠀⠀⠀⠀⠀⠀⠐⠀⠀⠀⠀⠀⠀⠀⠀⠐⠀⠀⠺⣿⣿
```

A dense fallback is not an option either, since at this size it would take forever.

So I tried the iterative solvers from IterativeSolvers.jl, since they are implemented in pure Julia and can therefore handle sparse Jacobians with any element type. But then I ran into the next problem: every solver I tried produced instability warnings, and the integration of the DAE stopped after one or two steps. My first thought was that this was due to the Rosenbrock integrators, which to my knowledge require accurate Jacobian solves that the iterative solvers simply could not deliver, but other algorithms fail as well.
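Roughly what I tried looks like this (a sketch, not my exact code: `f!`, `M`, `jac_sparsity`, `u0`, `tspan`, and `p` stand in for my actual mass-matrix DAE setup, and I'm using a Krylov `linsolve` from LinearSolve.jl so that no sparse LU is needed):

```julia
using OrdinaryDiffEq, LinearSolve

# Mass-matrix DAE formulation, widened to BigFloat
f    = ODEFunction(f!; mass_matrix = M, jac_prototype = jac_sparsity)
prob = ODEProblem(f, big.(u0), big.(tspan), p)

# Iterative (GMRES) linear solver instead of a sparse factorization
sol = solve(prob, Rodas5(linsolve = KrylovJL_GMRES());
            abstol = big"1e-20", reltol = big"1e-16")
```

This is the kind of setup that produces the instability warnings and stalls after one or two steps.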

So my question now is: how do I solve sparse DAEs with higher-precision numbers? I've been fighting with this for a week now and would be thankful for any hint toward a solution.