The DFBDF solver is great; how can I learn more about it?

I love the DFBDF solver for implicit differential equations: for my problem it is 2 to 4 times faster than IDA from Sundials, uses only half the memory, and is very stable.
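For context, here is a minimal sketch of how the two solvers can be compared head-to-head. It uses the standard Robertson DAE from the DifferentialEquations.jl documentation rather than my actual problem, so the setup (tolerances, time span) is an illustrative assumption, not the benchmark from my application:

```julia
using OrdinaryDiffEq   # provides DFBDF
using Sundials         # provides IDA

# Robertson chemical kinetics in fully implicit form: out = F(du, u, p, t)
function rober!(out, du, u, p, t)
    out[1] = -0.04u[1] + 1e4 * u[2] * u[3] - du[1]
    out[2] = 0.04u[1] - 3e7 * u[2]^2 - 1e4 * u[2] * u[3] - du[2]
    out[3] = u[1] + u[2] + u[3] - 1.0   # algebraic constraint
end

u0  = [1.0, 0.0, 0.0]
du0 = [-0.04, 0.04, 0.0]           # consistent initial derivatives
tspan = (0.0, 1e5)

prob = DAEProblem(rober!, du0, u0, tspan;
                  differential_vars = [true, true, false])

sol_dfbdf = solve(prob, DFBDF(); abstol = 1e-8, reltol = 1e-8)
sol_ida   = solve(prob, IDA();   abstol = 1e-8, reltol = 1e-8)
```

Wrapping the two `solve` calls in `@time` (or `@btime` from BenchmarkTools.jl) gives a rough speed and allocation comparison like the one I described.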

Is there any scientific paper about it?

And where is the source code? It should be in the SciML/OrdinaryDiffEq.jl repository on GitHub (high-performance ODE and DAE solvers), but there are so many files that refer to this type that I get confused… Is there perhaps a pull request with the initial commit of this solver?
UPDATE: This seems to be the initial PR: Implementation of paper "Solving 0 = F(t, y(t), y′(t)) in Matlab" by JunpengGao233 · Pull Request #1452 · SciML/OrdinaryDiffEq.jl · GitHub

This could be a simple benchmark comparing IDA and DFBDF: KiteModels.jl/examples/bench.jl at main · ufechner7/KiteModels.jl · GitHub


I assume this issue is related: Fully Implicit BDF · Issue #105 · SciML/OrdinaryDiffEq.jl · GitHub

It contains a link to a pdf document, but the link is dead…
@ChrisRackauckas Do you still have this document somewhere?

Nice background info: Backward differentiation formula - Wikipedia


It is this paper: Solving 0 = F(t, y(t), y′(t)) in Matlab

But you have to pay 30 EUR or have access via your university…

Internet archive seems to have it: Wayback Machine


Sci-Hub also has it:


Does anybody know what the letters in the name DFBDF stand for? I assume FBDF means Fractional Backward Differentiation Formula, but what does the first letter D stand for?