# Create a function of derivatives in Symbolics.jl

I have a function f(x,u;p) where x, u are both vectors of variables and p is a vector of parameters.

I would like to create new functions f_{x}(x,u;p) = \frac{\partial f}{\partial x} and f_{xu}(x,u;p) = \frac{\partial^2 f}{\partial x\,\partial u}, etc.
I don’t think I’m currently doing this well at all:

```julia
using LinearAlgebra, Symbolics

δ = 1.0; α = 0.67; β = 0.98  # parameters

# store parameters in a vector p
p = [δ, α, β]

# state vector to approximate around x̄
function x̄(p)
    δ, α, β = p
    return (((1/β) - 1 + δ)/α)^(1/(α - 1))
end

# control vector to approximate around ū
function ū(p)
    δ, α, β = p
    s_ss = (((1/β) - 1 + δ)/α)^(1/(α - 1))
    return s_ss^α - δ*s_ss
end

# objective/cost/utility/return function
function f(x, u; p=p)
    x1 = x
    u1 = u
    δ, α, β = p
    return log(u1)
end

Symbolics.@variables sx su  # make symbolic variables x, u

# Jacobian
sj = Symbolics.jacobian([f(sx, su; p=p)], [sx, su])
f_expr = build_function(sj, [sx, su])
Df = eval(first(f_expr))

# Hessian
sj = Symbolics.hessian(f(sx, su; p=p), [sx, su])
f_expr = build_function(sj, [sx, su])
Hf = eval(first(f_expr))

# call the new functions at x = 1, u = 2
Df([1, 2])
Hf([1, 2])
```


Are there faster/better ways to do this?

Are you trying to do a DSL for perturbation solutions? If so, then you can look at one version of it in GitHub - HighDimensionalEconLab/DifferentiableStateSpaceModels.jl

This is a proof of concept for an upcoming paper which we hope to publicize sooner rather than later. The code there gets a lot crazier because it needs more gradients to make the perturbation solver differentiable, and there are many missing features - but it may have what you need to get started with downstream applications.


That seems fine? I guess, what are your issues with it?


@ChrisRackauckas it’s something I hacked together while trying to figure out the Symbolics.jl docs.
Just wanted to make sure I wasn’t missing anything obvious.
It would be awesome if I could create a new function in one step:

```julia
f(x) = 2*x^3
df(x) = Symbolics.derivative(f(x), x)
# would automatically return a function df(x) = 6*x^2
```
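For what it's worth, something close to a one-step workflow is already possible by passing `expression = Val{false}` to `build_function`, which returns a compiled callable instead of an `Expr`. A minimal sketch, where `make_derivative` is a hypothetical helper name, not part of Symbolics' API:

```julia
using Symbolics

# Hypothetical helper (not part of Symbolics' API): build a compiled
# derivative of a scalar function f in one call.
function make_derivative(f)
    @variables x
    df_expr = Symbolics.derivative(f(x), x)   # symbolic df/dx
    # expression = Val{false} returns a callable instead of an Expr,
    # sidestepping a separate eval step
    return build_function(df_expr, x; expression = Val{false})
end

g(x) = 2x^3
dg = make_derivative(g)
dg(2.0)   # 6 * 2.0^2 = 24.0
```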


@jlperla I’m not working on perturbation. I’m really excited to see what you come up with.
I have very strong opinions about perturbation toolkits.
I believe Oren Levintal has the best (except it’s in Matlab, not on Github etc etc)
SolveDSGE.jl has some really nice features too (it allows OccBin, but only box constraints)
Dolo has some nice options for time iteration (the Python version has EGM!)
I really hope to see an Economics version of SciML
I doubt it though.
SciML gives the user fast, convenient access to hundreds of ODE solvers…
In a state-of-the-art interface

I don’t, except that it is best to stick with Dynare whenever possible, since it is so well maintained and tested. I have stronger opinions on the class of problems where we should even consider perturbation models, which is substantially smaller than what people use them for.

The goal of this package is for differentiable perturbation solvers, which none of the others provide. In a dream world it wouldn’t need to exist and something like Dynare.jl would provide that for everyone to use. Similarly, the SciML interfaces for solving initial value problems are focused on being differentiable as well (e.g. state-space models and likelihoods) - though I don’t think those are something that should live in the domain-specific DSL.

There are lots of reasons to do it in multiple steps. Quite often you want to save the expressions out as a normal .jl file so you can load it up with precompilation/etc. down the road. Symbolics can be very slow.
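A sketch of that save-to-file workflow, using a simplified `f = log(u)` and made-up file names (depending on the Symbolics version, the printed code may reference helpers that require `Symbolics` to be loaded when you `include` it later):

```julia
using Symbolics

@variables sx su
sj = Symbolics.jacobian([log(su)], [sx, su])   # Jacobian as in the question, with f = log(u)

# build_function returns (out-of-place, in-place) Exprs by default;
# write the out-of-place one to a .jl file for later reuse
oop_expr = first(build_function(sj, [sx, su]))
path = joinpath(mktempdir(), "jacobian_fn.jl")
write(path, string(oop_expr))

# Later (or in another session with Symbolics loaded):
Df = include(path)
Df([1.0, 2.0])   # 1×2 matrix [0.0 0.5]
```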

There are also some quirky world-age issues: you can’t just insert a new function into scope willy-nilly. See GitHub - SciML/RuntimeGeneratedFunctions.jl (functions generated at runtime without world-age issues or overhead) or GitHub - JuliaStaging/GeneralizedGenerated.jl (a generalized version of Julia’s `@generated` functions allowing closures, avoiding runtime `eval` or `invokelatest`) for wrappers, with all the caveats about working with the module of the functions.
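A minimal sketch of the RuntimeGeneratedFunctions wrapper, following its README pattern and applied to the Jacobian from the question:

```julia
using Symbolics, RuntimeGeneratedFunctions
RuntimeGeneratedFunctions.init(@__MODULE__)   # once per module

@variables sx su
sj = Symbolics.jacobian([log(su)], [sx, su])
oop_expr = first(build_function(sj, [sx, su]))

# Unlike eval, the wrapped function is callable immediately,
# even from code compiled before this point (no world-age errors)
Df = @RuntimeGeneratedFunction(oop_expr)
Df([1.0, 2.0])   # [0.0 0.5]
```

Note that `build_function(...; expression = Val{false})` does this wrapping for you under the hood.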


ehh, I’m not sure. That feels like over-reaching in complexity reduction. We need to be able to compute symbolic derivative expressions, of course, and then we have tools for building functions. Doing two function calls like this isn’t a major interface difference compared to doing it in one shorter call, but if we made the simpler interface here, we’d have two ways to do the same basic thing. If that’s ever done, it needs to have a higher payoff, like turning 10 lines of code into 1.
