How to use SciMLOperators.jl with automatic differentiation in DifferentialEquations.jl

Hello,

I have a system implemented as a FunctionOperator from SciMLOperators.jl, which I would like to integrate as a differential equation. For example:

using SciMLOperators, DifferentialEquations

# Wrap the (nonlinear) right-hand side du .= u .* u as a FunctionOperator.
op = FunctionOperator((du, u, p, t) -> du .= u .* u, rand(10); t = 0.0, p = [0.0], batch = true)

# Preallocate the operator's internal cache; rand(10) makes it Float64.
op = cache_operator(op, rand(10))

# Use the operator as the right-hand side of an in-place ODE problem.
prob = ODEProblem{true}(op, rand(10), (0.0, 1.0), [0.0])

# Rosenbrock23 computes the Jacobian with ForwardDiff by default.
solve(prob, Rosenbrock23())

The code above defines the operator and the corresponding ODE problem, then solves it in-place with an integrator that uses automatic differentiation by default. However, this fails, presumably because the operator's preallocated cache holds Float64 values while ForwardDiff needs to push dual numbers through it. How can I make this work correctly? Any suggestions would be greatly appreciated!
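
For comparison, passing the same right-hand side directly, without the FunctionOperator wrapper and therefore without a preallocated cache, solves fine with the same AD-based integrator (a minimal sketch; prob_direct is just an illustrative name):

# The plain in-place function carries no internal cache, so ForwardDiff's
# dual numbers propagate through it without issue.
prob_direct = ODEProblem{true}((du, u, p, t) -> du .= u .* u, rand(10), (0.0, 1.0), [0.0])
solve(prob_direct, Rosenbrock23())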

SciMLOperators needs more work here; propagating dual numbers through a cached FunctionOperator isn't supported yet.
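
In the meantime, one workaround (sketched here, assuming a finite-difference Jacobian is acceptable for your problem) is to turn off autodiff in the solver, so no dual numbers ever reach the operator's Float64 cache:

# Finite differencing keeps every evaluation in Float64, matching the
# element type of the operator's preallocated cache.
solve(prob, Rosenbrock23(autodiff = false))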

Good to know, thanks!