Best method for (scalar) second derivative with ForwardDiff

What is the best/most efficient method to find the first two derivatives of a scalar function with ForwardDiff?

This old post suggests

ForwardDiff.derivative(x -> ForwardDiff.derivative(f, x), 1.0)

to compute the second derivative, but that seems inefficient given the algebraic structure of the computation. I'd also like to get the first derivative using the DiffResults API, and that looks messy to do with nested calls.
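For concreteness, here is the nested-call pattern I mean, with `sin` standing in for the actual function (my choice for illustration):

```julia
using ForwardDiff

f(x) = sin(x)  # stand-in for the actual scalar function

# first derivative: cos(1.0)
d1 = ForwardDiff.derivative(f, 1.0)

# second derivative via nesting: -sin(1.0)
d2 = ForwardDiff.derivative(x -> ForwardDiff.derivative(f, x), 1.0)
```

Getting both values requires two separate top-level calls, which is the duplication I'd like to avoid.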

Alternatively, I can wrap everything in a vector and use

ForwardDiff.hessian(x -> f(x[1]), [x])

but I am not sure about the impact of extra vector allocations.
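Spelled out, that wrapper approach would look something like this (again with `sin` as a stand-in for `f`):

```julia
using ForwardDiff

f(x) = sin(x)  # stand-in for the actual scalar function
x = 1.0

# wrap the scalar in a 1-element vector and take a 1×1 Hessian
H = ForwardDiff.hessian(v -> f(v[1]), [x])

d2 = H[1, 1]  # -sin(1.0)
```

The `[x]` literal allocates a fresh `Vector` on every call, which is the overhead I was worried about.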

Is there a better builtin method already implemented in the package? The source of DiffResults suggests that it supports higher derivatives natively, but I don’t see anything corresponding in ForwardDiff, so maybe the API is there but not the implementation.

Thanks in advance for any help!

I haven’t delved into this in a while. I believe you’re looking to call ForwardDiff.hessian! with a DiffResults.HessianResult object to compute multiple derivatives simultaneously.

If the scalar/vector distinction is causing issues, you can always use SVector(x) from the StaticArrays package to create a 1-element vector (or small-ish vectors, in general) with virtually no overhead. StaticArrays plays well with these autodiff packages.

julia> using DiffResults, ForwardDiff, StaticArrays

julia> x = SVector(1.0)
1-element SVector{1, Float64} with indices SOneTo(1):
 1.0

julia> res = DiffResults.HessianResult(x)
ImmutableDiffResult(1.0, ([1.0], [0.0;;]))

julia> ForwardDiff.hessian!(res,z->sin(only(z)),x)
ImmutableDiffResult(0.8414709848078965, ([0.5403023058681398], [-0.8414709848078965;;]))

Note that I used sin(only(z)) instead of sin(z) to extract the one element from the vector input.
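The individual pieces can then be read back out of the result object with the DiffResults accessors:

```julia
using DiffResults, ForwardDiff, StaticArrays

x = SVector(1.0)
res = DiffResults.HessianResult(x)

# with an SVector input the result is immutable, so rebind the return value
res = ForwardDiff.hessian!(res, z -> sin(only(z)), x)

DiffResults.value(res)     # sin(1.0)
DiffResults.gradient(res)  # 1-element vector containing cos(1.0)
DiffResults.hessian(res)   # 1×1 matrix containing -sin(1.0)
```

Note the reassignment of `res`: since the result built from an SVector is an ImmutableDiffResult, `hessian!` returns a new result rather than mutating in place.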


Many thanks; this works and it seems like a good solution!

In the meantime I have also found an open issue on GitHub from 2017 for an API to compute higher scalar derivatives, so I assume there is still nothing implemented natively in ForwardDiff.

What do you think ForwardDiff.hessian does? It works in exactly the same way, by computing the Jacobian of the gradient.
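A quick way to check this equivalence directly (a sketch, with `sin` as the example function):

```julia
using ForwardDiff

g(v) = sin(v[1])
x = [1.0]

# the built-in Hessian ...
H = ForwardDiff.hessian(g, x)

# ... versus explicitly taking the Jacobian of the gradient
Hj = ForwardDiff.jacobian(z -> ForwardDiff.gradient(g, z), x)

H ≈ Hj  # the two agree
```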

For higher-order derivatives like this, you may want to try TaylorDiff.jl, which is optimized for this kind of case.

It’s built on the same underpinnings as ForwardDiff.jl, with specific optimizations for going to higher order.