I’ve been using `SciML` with great success for over a year now. I’m a very big fan.

I hope this doesn’t seem rude, but I have two proposals which would, in my opinion, improve SciML.

I noticed that my main struggle is calculating derivatives efficiently, especially when sparsity is involved. My first question is: **would SciML benefit from a `DifferentiationProblem`?** This would allow choosing the type of derivative by changing a single line, just as for ODEs, PDEs, and so on. (I know of `AbstractDifferentiation.jl`, but it was never finished.) **A problem with this** is that we can’t simply use the familiar `CommonSolve` interface:

```
prob = DifferentiationProblem(func)
solve(prob, SymbolicDerivative(), inputs1...)
solve(prob, SymbolicDerivative(), inputs2...)
```

Because that would require recalculating the symbolic derivative on every repeated `solve` with different inputs.
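To make the cost asymmetry concrete, here is a minimal sketch (assuming `Symbolics.jl`) of why the symbolic step is the part worth caching: the symbolic differentiation and compilation are expensive and should happen once, while the resulting compiled function is cheap to call repeatedly.

```
# Sketch using Symbolics.jl: the symbolic work happens once,
# and only the compiled function should be reused across inputs.
using Symbolics

@variables x y
f = [x^2 * y, sin(x) + y]

# Expensive, one-time symbolic step:
J = Symbolics.jacobian(f, [x, y])

# Compile the symbolic Jacobian to a numeric function
# (build_function returns out-of-place and in-place variants).
jac_func = eval(build_function(J, [x, y])[1])

# Cheap, repeatable numeric evaluations:
jac_func([1.0, 2.0])
jac_func([0.5, -1.0])
```

A `solve(prob, SymbolicDerivative(), inputs...)` call with no cache would have to redo the first two steps every time.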

This begs the second question: **would SciML benefit from solver caches?**

i.e., change the `CommonSolve` interface to something like:

```
prob = Problem(...)
solver = SymbolicDerivative(prob)
solve(solver, inputs1...)
solve(solver, inputs2...)
```
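For what it’s worth, `CommonSolve` already exposes a three-function `init`/`solve!`/`solve` split, which seems close to this: `init` builds a cache object once, and `solve!` mutates it. A hedged sketch of how a `DifferentiationProblem` could reuse that existing pattern — note that `DifferentiationProblem`, `SymbolicDerivative`, and this use of `reinit!` are hypothetical here, not existing SciML API:

```
# Hypothetical sketch reusing the existing CommonSolve init/solve! split.
prob  = DifferentiationProblem(func)
cache = init(prob, SymbolicDerivative())  # expensive setup happens once
sol1  = solve!(cache)                     # cheap repeated solves
reinit!(cache, inputs2...)                # swap inputs without rebuilding
sol2  = solve!(cache)
```

This would keep the one-line-to-switch-solvers property while moving the expensive work out of the repeated call.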

Similar inefficiencies from repeated solver usage occur, for example, in `Optimization.jl`, where some solvers require large array preallocations on each `solve`; it would be nice if solver caches existed for these too. (Note: caches for derivatives already exist in `OptimizationFunction`.)

I know this latter remark seems nitpicky, but in most use cases I have come across, I want to reuse a solver many times for different initial conditions, and the custom solvers I use are relatively slow to initialise.
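For the ODE case at least, this reuse pattern already exists via the integrator interface, which is roughly what I would like everywhere. A minimal sketch (assuming `OrdinaryDiffEq.jl`):

```
# Sketch using the OrdinaryDiffEq integrator interface, which already
# separates expensive initialisation from repeated solving.
using OrdinaryDiffEq

prob = ODEProblem((u, p, t) -> -u, 1.0, (0.0, 1.0))
integrator = init(prob, Tsit5())   # allocations happen once, here

for u0 in (1.0, 2.0, 3.0)
    reinit!(integrator, u0)        # reuse the cache with a new initial condition
    solve!(integrator)
end
```

Having this `init`/`reinit!`/`solve!` shape uniformly across SciML solvers is essentially what I am asking for.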