Simple numerical differentiation

Hello there, I am wondering how to implement simple numerical differentiation in Julia. My MWE looks like this:

using ForwardDiff
p = [1, 2, 3]
f(x::Vector) = p[1] .+ p[2] .* x .+ p[3] .* x .^ 2
x = 1:10
g = x -> ForwardDiff.gradient(f, x);
g(x)

The result looks like Array{Array{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f),Float64},Float64,11},1},1}.

I am looking for a simple elementwise derivative, not a gradient. Note that my question is not
about analytical differentiation, where SymPy or Mathematica would be the right tools,
even though my MWE uses a simple polynomial. I would like to differentiate
the function f numerically. Searching this topic mostly turns up ODE how-tos, which is not
what I need. Any ideas? Thanks in advance…

Try https://github.com/JuliaDiff/FiniteDiff.jl

It’s slightly unclear what you’re trying to do in your example. If f takes a single (scalar) argument rather than a vector, you can use ForwardDiff.derivative and evaluate it on each element of 1:10 like this:

using ForwardDiff
const p = [1, 2, 3]
f(x) = p[1] + p[2]*x + p[3]*x^2
g(x) = ForwardDiff.derivative(f, x)
map(g, 1:10)

Using a finite difference package (here FiniteDifferences.jl), it might look like this:

using FiniteDifferences
const fd = central_fdm(5, 1)  # 5-point central rule for the 1st derivative
g(x) = fd(f, x) # using f from above snippet
map(g, 1:10)

You might be looking for ForwardDiff.gradient(sum∘f, 1:10). In general gradient(h, y) expects that h(y) is a number, and y is a vector (or some other array). (ForwardDiff.derivative is for real → real functions, and ForwardDiff.jacobian for vector → vector.)
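To make that real → real / vector → real / vector → vector split concrete, here is a minimal sketch of the three ForwardDiff entry points using the polynomial from the question (f′(x) = 2 + 6x, so the expected values in the comments follow from that):

```julia
using ForwardDiff

p = [1, 2, 3]
f(x) = p[1] .+ p[2] .* x .+ p[3] .* x .^ 2   # broadcasting works for scalars or vectors

# real -> real: derivative
ForwardDiff.derivative(f, 1.0)               # 2 + 6*1 = 8.0

# vector -> real: gradient (sum makes the output a scalar)
ForwardDiff.gradient(sum ∘ f, collect(1.0:3.0))   # [8.0, 14.0, 20.0]

# vector -> vector: jacobian (diagonal here, since f acts componentwise)
ForwardDiff.jacobian(f, collect(1.0:3.0))
```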


Note that what the ForwardDiff package offers is not numerical differentiation. There are now three families of approaches to differentiation:

  • symbolic: Sympy, …
  • numerical: FiniteDifferences, FiniteDiff, …
  • algorithmic: ForwardDiff, Yota, Zygote, ReverseDiff, …
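As a sketch of how the three camps relate on a toy function (assuming ForwardDiff and FiniteDifferences are installed; sin is just an illustrative stand-in, since its exact derivative cos is what a symbolic tool would return):

```julia
using ForwardDiff, FiniteDifferences

h(x) = sin(x)
x0 = 0.5

exact = cos(x0)                          # what a symbolic tool would return
ad    = ForwardDiff.derivative(h, x0)    # algorithmic: exact up to floating point
num   = central_fdm(5, 1)(h, x0)         # numerical: small truncation/roundoff error

ad ≈ exact                # true
abs(num - exact) < 1e-8   # true for this 5-point central scheme
```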
using FiniteDiff
p = [1, 2, 3]
f(x) = p[1] + p[2]*x + p[3]*x^2  # scalar version, so the plain derivative routine applies
g(x) = FiniteDiff.finite_difference_derivative(f, x)
g.(1:10)
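If you do want to keep the vector-valued f from the question, the elementwise derivatives sit on the diagonal of the Jacobian (since f acts componentwise), which FiniteDiff can also compute; a sketch, assuming FiniteDiff is installed:

```julia
using FiniteDiff, LinearAlgebra

p = [1, 2, 3]
f(x::Vector) = p[1] .+ p[2] .* x .+ p[3] .* x .^ 2

J = FiniteDiff.finite_difference_jacobian(f, collect(1.0:10.0))
diag(J)   # ≈ 2 .+ 6 .* (1:10), i.e. f'(x) at each point
```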