What is the "interface" necessary for DifferentialEquations.jl if your data type is not a subtype of AbstractArray?

There are a few similar conversations from previous years on this forum (e.g. JuliaDiffEq with custom types - #24 by mzilhao), but I was not able to find official documentation on the topic, and I am uncertain how much of that information is out of date. Implementing the interface suggested in older conversations does not currently work for me (it seems `vec` or `safevec` might be required, which confuses me).

Hence my question: For a completely arbitrary type (one that permits 1D indexing and has an eltype of some reasonable scalar type, but is not a subtype of AbstractArray), what is the officially required interface for it to work with DifferentialEquations.jl? Or are we advised not to use completely custom objects?

It depends on the solver. With OrdinaryDiffEq, you just need a valid broadcast. Here’s a tested type which doesn’t even have indexing and works with non-stiff solvers in OrdinaryDiffEq:
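The worked example from the thread is not reproduced here, but a minimal sketch of such a broadcast-only type could look like the following. The name `MyVec` and all implementation details are illustrative, not from the original post:

```julia
# A container that deliberately is NOT an AbstractArray: no getindex at all.
struct MyVec{T}
    data::Vector{T}
end

Base.length(v::MyVec) = length(v.data)
Base.axes(v::MyVec) = axes(v.data)      # broadcasting needs shape information
Base.similar(v::MyVec) = MyVec(similar(v.data))
Base.zero(v::MyVec) = MyVec(zero(v.data))

# Give MyVec its own broadcast style so `u .+ a .* k` produces a MyVec.
struct MyVecStyle <: Broadcast.BroadcastStyle end
Base.BroadcastStyle(::Type{<:MyVec}) = MyVecStyle()
Base.BroadcastStyle(::MyVecStyle, ::Broadcast.BroadcastStyle) = MyVecStyle()

# Unwrap MyVec arguments, broadcast over the underlying vectors, rewrap.
unwrap(x) = x
unwrap(v::MyVec) = v.data

function Base.copy(bc::Broadcast.Broadcasted{MyVecStyle})
    bcf = Broadcast.flatten(bc)          # undo fusion nesting for simplicity
    MyVec(broadcast(bcf.f, map(unwrap, bcf.args)...))
end

function Base.copyto!(dest::MyVec, bc::Broadcast.Broadcasted{MyVecStyle})
    bcf = Broadcast.flatten(bc)
    broadcast!(bcf.f, dest.data, map(unwrap, bcf.args)...)
    dest
end
```

With broadcast (plus the usual scalar operations on the eltype) in place, explicit solvers have a chance of working on such a type; exactly which extra methods (`zero`, norms, …) a given solver touches is version-dependent, so treat this as a starting point rather than the complete required interface.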

For implicit methods you need linear algebraic operations.


This seems to work for out-of-place (`in-place: false`) problems, but when I try to set up an `in-place: true` problem, I get the following error:

LoadError: MethodError: no method matching recursivecopy!(::MyType{...}, ::MyType{...})

Does this mean that my package has to depend on RecursiveArrayTools so that I can define a `recursivecopy!` method? Or is there a way to define this interface without importing extra packages?

You probably will need RecursiveArrayTools, because it defines some things that essentially all packages need. The definition of `recursivecopy!` could probably move to ArrayInterface.jl and from there to Base.
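Assuming the custom type wraps something copyable, the method is essentially a one-liner. `MyVec` here is a hypothetical stand-in for your type, not anything from the thread:

```julia
using RecursiveArrayTools

# Hypothetical custom state type wrapping a plain Vector.
struct MyVec{T}
    data::Vector{T}
end

# Tell the in-place cache setup how to copy one MyVec into another.
function RecursiveArrayTools.recursivecopy!(dest::MyVec, src::MyVec)
    copyto!(dest.data, src.data)
    dest
end
```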


Now I have the weirdest problem with “ghost” allocations for this new type that implements this interface.

An `in-place: false` problem makes fewer allocations than the `in-place: true` version of the same problem!? That is not a typo; it is the opposite of what one would expect. And it does not seem to be a mis-implemented broadcast, as broadcasting on its own does not allocate.

Debugging this has been a mess: each of the functions called inside `perform_step!` does not allocate when tested with `@time`, yet `@time perform_step!` shows a very large number of allocations.

`perform_step!(i, c::Tsit5Cache)` is what performs the in-place operations. Testing each of its statements by hand (of the form `integrator.f(du, u)` and `u .+ a .* k`) does not allocate, but running `perform_step!` as a whole does. Yet profiling with `julia --track-allocation=user` (or `=all`) does not show the allocations. Are there caveats I should be aware of when using `--track-allocation`? The `perform_step/*.jl.mem` files showed very minimal allocation, nothing like what the `@time` or `@btime` macros show in the REPL (and it was not allocation due to compilation).
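One measurement caveat worth ruling out (an illustration, not necessarily the cause here): `@time` on a call in global scope can report allocations that come from dynamic dispatch on non-`const` globals rather than from the function itself:

```julia
f!(y) = (y .= y .+ 1; nothing)   # in-place broadcast; allocates nothing by itself

x = rand(1000)                    # non-const global

# Measured from global scope, @time may charge dispatch/boxing
# overhead on the global `x` to f! itself.
@time f!(x)

# Measured inside a function (after a warm-up call), the same call
# reports zero allocations.
measure(y) = @allocated f!(y)
measure(x)                        # warm-up: compiles measure and f!
@show measure(x)
```

BenchmarkTools' `@btime f!($x)` sidesteps the same pitfall via `$`-interpolation.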

I think that's a general problem with allocation tracking. `--track-allocation` should be correct, IIRC.