GTPSA.jl is a new Julia package that provides a full-featured interface to the Generalised Truncated Power Series Algebra (GTPSA) library for computing Taylor expansions, or truncated power series (TPSs), of real and complex multivariable functions to arbitrary orders.
Truncated Power Series Algebra (TPSA) performs forward-mode automatic differentiation (AD) similar to the dual-number implementation in ForwardDiff.jl. However, instead of nesting first-order derivatives to reach higher orders, TPSA extends naturally to arbitrary orders by working directly with the truncated power series. This, paired with a highly optimized monomial indexing function and storage scheme for propagating the partial derivatives, makes GTPSA.jl significantly faster for 2nd-order calculations and above, with similar performance at 1st order. In our example, GTPSA was 3.3x faster than ForwardDiff to 2nd order, and 18.5x faster to 3rd order.
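To show what this looks like in practice, here is a minimal sketch of basic usage (the Descriptor/vars names follow the package documentation and may evolve before v1.0):

```julia
using GTPSA

# Define a TPSA with 2 variables, truncated at 2nd order
d = Descriptor(2, 2)

# Get the TPSs corresponding to each variable (expansion point is the origin)
x = vars(d)

# Overloaded math propagates the full truncated series, so f carries the value
# and every partial derivative of sin(x1)*exp(x2) up to 2nd order
f = sin(x[1]) * exp(x[2])
```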
GTPSA provides several advantages over current Julia AD packages:
Speed: GTPSA.jl is significantly faster than ForwardDiff.jl for 2nd-order calculations and above, and has very similar performance at 1st order
Custom Orders in Individual Variables: Other packages use a single maximum order for all variables. With GTPSA, the maximum order can be set individually for each variable, as well as for the parameter part of the monomial. For example, computing the Taylor expansion of f(x_1,x_2) to 1st order in x_1 and 3rd order in x_2 is possible (see the sketch after this list)
Complex Numbers: GTPSA.jl natively supports complex numbers and allows for mixing of complex and real truncated power series
Distinction Between State Variables and Parameters: When the solution of a differential equation is expressed as a power series in both the dependent variables and the parameters, distinguishing between the two can be advantageous in analysis
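As a rough sketch of the last few points (the exact Descriptor signatures below are taken from the documentation and may differ between versions):

```julia
using GTPSA

# Individual variable orders: 1st order in x1, 3rd order in x2
d1 = Descriptor([1, 3], 3)
x = vars(d1)
f = exp(x[1]) + im*cos(x[2])   # complex and real TPSs mix freely

# 2 state variables to 2nd order plus 1 parameter to 1st order
d2 = Descriptor(2, 2, 1, 1)
x = vars(d2)
k = params(d2)
g = x[1] + x[2]*k[1]           # the parameter part of each monomial is tracked separately
```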
To read more about the inner workings of the GTPSA C library, advanced users are referred to this paper, written by the creators of GTPSA.
As we'd like to prepare for a v1.0 release, we welcome any comments or criticisms of the package interface and documentation!
Given the power of the GTPSA library, our long-term goal is to rewrite it fully in Julia. If you are interested in this effort, feel free to reach out or add to the current issue on GitHub.
How does it compare to TaylorDiff.jl? It seems like the same kind of idea (leveraging the Faà di Bruno formula), but TaylorDiff does that at Julia's compile time.
For multivariable functions, TaylorDiff.jl only seems to give you the directional derivatives, whereas GTPSA.jl will give you all of the higher-order partial derivatives. Because of this, I haven't been able to come up with an apples-to-apples benchmark to compare the two.
Also, GTPSA does much more than just use the power series directly for calculating higher orders; the library plays a lot of tricks to store and propagate the many monomial coefficients efficiently, making it especially powerful for calculating derivatives w.r.t. many variables to high orders.
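For example, a single forward evaluation on TPSs gives you the entire derivative tensor up to the truncation order; here is a rough sketch assuming the gradient/hessian convenience getters described in the GTPSA.jl docs:

```julia
using GTPSA

d = Descriptor(3, 2)       # 3 variables, 2nd order
x = vars(d)
f = x[1]*x[2] + sin(x[3])

# Every 1st- and 2nd-order partial derivative from the one evaluation above
g = GTPSA.gradient(f)      # ∇f at the expansion point
H = GTPSA.hessian(f)       # the full Hessian, not just a directional slice
```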
Congrats on the release! Do you think it might make sense to add GTPSA.jl bindings to DifferentiationInterface.jl?
I didn't add any for TaylorDiff.jl because it does not support things like gradient or Hessian.
I just tried to add this as a benchmark in our example; however, Diffractor.jl is returning an incorrect Jacobian. The benchmark uses a function m: \mathbb{R}^{58} \rightarrow \mathbb{R}^6.
The Jacobian calculated with GTPSA.jl agrees exactly with ForwardDiff.jl. Therefore, since Diffractor.jl is not computing the correct Jacobian, it is probably not appropriate to add it as a benchmark yet. I will submit this as an issue to Diffractor.jl, though.
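For reference, the cross-check itself is simple; here is a sketch with a small stand-in function (not the actual 58-variable map), assuming the jacobian getter from the GTPSA.jl docs:

```julia
using GTPSA, ForwardDiff

# Small stand-in function for illustration only
h(v) = [v[1]*v[2] + sin(v[3]), exp(v[1]) - v[3]^2]
x0 = [0.5, 1.0, 2.0]

d  = Descriptor(3, 1)                   # 1st order suffices for a Jacobian
Δx = vars(d)
J_gtpsa = GTPSA.jacobian(h(x0 .+ Δx))   # Jacobian from the 1st-order coefficients

J_fd = ForwardDiff.jacobian(h, x0)
J_gtpsa ≈ J_fd                          # true up to floating-point roundoff
```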
I haven't heard of this library, but I would be really interested to see how GTPSA compares. I'll keep an eye on TimSiebert1/ADOLC.jl on GitHub, and once it is partially usable I can look into creating a benchmark.
Absolutely! We definitely want to use GTPSA.jl for AD with Julia's many already-available optimizers, and this seems like an important first step. @nsajko already opened an issue about this on the GTPSA GitHub, and I see some steps in the documentation on how to do this. I'll take a look and try to make sense of how to implement this interface, and will probably reach out on the issues with questions.
Note that for ForwardDiff.jl to be optimally efficient, you need to provide it with a precomputed config object. DifferentiationInterface.jl allows that easily with the backend-agnostic "preparation" mechanism.
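Concretely, something like this (the ForwardDiff config API is documented; the exact argument order of the DifferentiationInterface preparation calls may differ between versions):

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
x = rand(58)

# Plain ForwardDiff: build the config once and reuse it on every call
cfg = ForwardDiff.GradientConfig(f, x)
g1  = ForwardDiff.gradient(f, x, cfg)

# DifferentiationInterface: backend-agnostic preparation does the same job
backend = AutoForwardDiff()
prep = prepare_gradient(f, backend, x)
g2   = gradient(f, prep, backend, x)
```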
The only interface it has is TaylorDiff.derivative; see the API reference. Of course, I could reconstruct everything else from there; I just didn't take the time, and there might be smarter ways to do it from within TaylorDiff.
One interesting case for a Taylor-mode AD is how it handles nesting.
The goal is to avoid nested forward mode and use Taylor mode instead.
But you can't always do so easily at the top level, because of encapsulation: some function is called in the middle of the code you are differentiating, and that function happens to internally call AD.
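As a toy illustration of the nesting being replaced (the GTPSA lines are a sketch based on its documented Descriptor/vars API):

```julia
import ForwardDiff
using GTPSA

f(x) = sin(x) * exp(x)

# Naive nested forward mode: dual numbers wrapped in dual numbers
d2f(x) = ForwardDiff.derivative(y -> ForwardDiff.derivative(f, y), x)
d2f(0.3)            # ≈ 2*cos(0.3)*exp(0.3)

# Taylor mode instead: one 2nd-order TPS pass carries the value and both
# derivatives at 0.3 simultaneously
d = Descriptor(1, 2)
x = vars(d)
t = f(0.3 + x[1])
```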
In Diffractor, we have a flag (era-mode) that transforms nested forward mode into Taylor forward mode where it is encountered.
I think it is possible to do this with operator overloading AD.
In DifferentiationInterface.jl, we have the internal DifferentiationInterface.nested, which lets the autodiff backend know it is inside another autodiff operator.