Hello!

I am looking for the best package to compute a numerical Jacobian.

Best,

Juan

For Y = g(X), `CoupledFields.gradvecfield([a b], X, Y, kernelpars)`

returns n gradient matrices, for n random points in X.

For parameters [a b]: a is a smoothness parameter, and b is a ridge parameter.

```
using CoupledFields
g(x, y, z) = x * exp(-x^2 - y^2 - z^2)
X = -2 .+ 4 * rand(100, 3)
Y = g.(X[:, 1], X[:, 2], X[:, 3])
kernelpars = GaussianKP(X)
βg = gradvecfield([0.5 -7], X, Y[:, 1:1], kernelpars)
```

Also, `CoupledFields` doesn't require a closed-form function; it can be used if you only have the observed fields X and Y.

Your best bet is to use ForwardDiff.jl, which uses automatic differentiation:

```
julia> using ForwardDiff

julia> ForwardDiff.jacobian(x -> exp.(x), rand(2))
2×2 Array{Float64,2}:
 2.33583  0.0
 0.0      1.59114
```


Yeah, that's what I've been trying to use. However, it seems to me that ForwardDiff does not work if the function does not have a closed form. Or at least it is not working for me.


You could also use DiffEqDiffTools.jl to calculate the Jacobian via finite differencing.
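A minimal sketch of that approach, with a toy function of my own (not from this thread); DiffEqDiffTools has since been succeeded by FiniteDiff.jl, but the call is similar:

```
using DiffEqDiffTools

# Toy vector-valued function; its exact Jacobian at [1, 2] is [2 1; 0 3]
f(x) = [x[1]^2 + x[2], 3 * x[2]]

# Approximate the Jacobian by finite differences
J = DiffEqDiffTools.finite_difference_jacobian(f, [1.0, 2.0])
```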

Also, if you post your function and the error message, we could figure out why ForwardDiff isn't working.


You can also use FiniteDifferences.jl to calculate it via finite differencing.

https://www.juliadiff.org/FiniteDifferences.jl/latest/pages/api/#FiniteDifferences.jacobian
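For example, a sketch using the linked `jacobian` function with a fifth-order central rule (the test function here is my own, for illustration):

```
using FiniteDifferences

# Toy function; its exact Jacobian at [1, 2] is [2 1; 0 3]
f(x) = [x[1]^2 + x[2], 3 * x[2]]

# central_fdm(5, 1): 5-point central rule for first derivatives.
# jacobian returns a tuple with one Jacobian per argument of f.
J = FiniteDifferences.jacobian(central_fdm(5, 1), f, [1.0, 2.0])[1]
```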


@jmcastro2109: when you asked via e-mail you also mentioned that you are using these for indirect inference. The problem with that is that for discrete-choice problems, derivatives may not be easy to approximate using (finite) samples, so I would consider using a derivative-free optimization method.
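For instance, Optim.jl's Nelder-Mead method needs only objective evaluations, no derivatives; the objective below is the standard Rosenbrock test function, standing in for the actual indirect-inference criterion:

```
using Optim

# Rosenbrock test function, used here as a placeholder objective
rosenbrock(θ) = (1.0 - θ[1])^2 + 100.0 * (θ[2] - θ[1]^2)^2

# Nelder-Mead is derivative-free: it only evaluates the objective,
# so a noisy or nonsmooth simulated criterion does not break it
result = optimize(rosenbrock, [0.0, 0.0], NelderMead())
θhat = Optim.minimizer(result)
```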
