Numerical Jacobian

Hello!

I am looking for the best package to compute a numerical Jacobian.

Best,
Juan


For π‘Œ = g(𝑋), CoupledFields.gradvecfield([a b], X, Y, kernelpars) returns 𝑛 gradient matrices, for 𝑛 random points in 𝑋.
For parameters [π‘Ž 𝑏]: π‘Ž is a smoothness parameter, and 𝑏 is a ridge parameter

using CoupledFields
g(x, y, z) = x * exp(-x^2 - y^2 - z^2)
X = -2 .+ 4 * rand(100, 3)          # 100 random points in [-2, 2]^3
Y = g.(X[:, 1], X[:, 2], X[:, 3])   # evaluate g at each point

kernelpars = GaussianKP(X)
∇g = gradvecfield([0.5 -7], X, Y[:, 1:1], kernelpars)

Also, CoupledFields doesn’t require a closed-form function; it can be used if you only have the observed fields 𝑋 and 𝑌.
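
For example, a minimal sketch with random stand-in data (in practice 𝑋 and 𝑌 would be your observed data matrices):

using CoupledFields
X = randn(100, 3)   # stand-in for an observed field (n × p)
Y = randn(100, 2)   # stand-in for an observed field (n × q)

kernelpars = GaussianKP(X)
∇g = gradvecfield([0.5 -7], X, Y, kernelpars)   # n gradient matrices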

Your best bet is to use ForwardDiff.jl, which uses automatic differentiation:

julia> using ForwardDiff

julia> ForwardDiff.jacobian(x->exp.(x), rand(2))
2×2 Array{Float64,2}:
 2.33583  0.0
 0.0      1.59114

Yeah, that’s what I’ve been trying to use. However, it seems to me that if I have a function that does not have a closed form, ForwardDiff does not work. Or at least it is not working for me.


You could also use DiffEqDiffTools.jl to calculate the Jacobian via finite differencing.
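
If I remember the API correctly, the out-of-place method looks something like this (a minimal sketch; finite_difference_jacobian and its defaults are from memory, so check the docs):

using DiffEqDiffTools
f(x) = [x[1]^2 * x[2], 5x[1] + sin(x[2])]
J = DiffEqDiffTools.finite_difference_jacobian(f, [1.0, 2.0])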

Also, if you post your function and the error message, we could figure out why ForwardDiff isn’t working.
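
For what it’s worth, ForwardDiff doesn’t actually need a closed-form expression: any Julia function works, as long as it is written generically enough for ForwardDiff’s dual numbers to flow through (e.g. no Vector{Float64} argument annotations). A minimal sketch with an iterative, no-closed-form function:

using ForwardDiff

# A Newton iteration converging to sqrt.(x); no closed-form expression here.
function f(x::AbstractVector)
    y = copy(x)
    for _ in 1:20
        y = 0.5 .* (y .+ x ./ y)
    end
    return y
end

ForwardDiff.jacobian(f, [2.0, 3.0])   # ≈ diagonal matrix with 0.5 ./ sqrt.([2.0, 3.0])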


FiniteDifferences.jl can also calculate the Jacobian via finite differencing:

https://www.juliadiff.org/FiniteDifferences.jl/latest/pages/api/#FiniteDifferences.jacobian
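
A minimal sketch: central_fdm(5, 1) builds a 5-point central scheme for the first derivative, and jacobian returns one Jacobian per argument of f, so take the first:

using FiniteDifferences
f(x) = [x[1] * x[2], exp(x[1])]
J, = jacobian(central_fdm(5, 1), f, [1.0, 2.0])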


@jmcastro2109: when you asked via e-mail, you also mentioned that you are using this for indirect inference. The problem is that for discrete-choice problems, derivatives may not be easy to approximate from (finite) samples, so I would consider using a derivative-free optimization method.
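
For example, Nelder-Mead from Optim.jl needs no derivatives at all (a minimal sketch; the objective below is a hypothetical stand-in for a non-smooth indirect-inference criterion):

using Optim

# Stand-in objective: non-smooth, as simulated discrete-choice criteria often are.
obj(θ) = (θ[1] - 1.0)^2 + abs(θ[2] - 0.3)

res = optimize(obj, [0.0, 0.0], NelderMead())   # derivative-free
Optim.minimizer(res)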
