Although it’s not quite stable yet, people seem to be interested in the topic, so I think it’s time to announce a package I’ve been working on lately.
XDiff.jl is an expression differentiation package supporting a fully symbolic approach to computing tensor derivatives. Unlike automatic differentiation packages, XDiff.jl can output not only ready-to-use derivative functions but also the symbolic expressions behind them, suitable for further optimization and code generation. Here’s an example:
function ann(w1, w2, w3, x1)
    _x2 = w1 * x1
    x2 = log(1.0 + exp(_x2))    # soft ReLU unit
    _x3 = w2 * x2
    x3 = log(1.0 + exp(_x3))    # soft ReLU unit
    x4 = sum(w3 * x3)
    return 1.0 / (1.0 + exp(-x4))   # sigmoid output
end
# ANN input parameter types
types = (Matrix{Float64}, Matrix{Float64}, Matrix{Float64}, Vector{Float64})
# generate example input
w1, w2, w3, x1 = randn(10,10), randn(10,10), randn(1,10), randn(10)
# create a dict of symbolic derivatives
dexs = rdiff(ann, types)
dexs[:w1] # ==> quote ... end
# create derivative functions
dw1, dw2, dw3, _ = fdiff(ann, types)
dw1(randn(100,100), randn(100,100), randn(1,100), randn(100))
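To sanity-check the generated derivatives, you can compare them with a finite-difference approximation. The snippet below is my own sketch, not part of XDiff.jl; it assumes that dw1 returns the gradient of ann’s scalar output with respect to w1, with the same shape as w1:
# finite-difference check of one entry of the gradient
# (assumes dw1(w1, w2, w3, x1) returns d ann / d w1, same shape as w1)
w1, w2, w3, x1 = randn(10,10), randn(10,10), randn(1,10), randn(10)
g = dw1(w1, w2, w3, x1)
h = 1e-6
w1p = copy(w1); w1p[3,4] += h   # perturb one weight up
w1m = copy(w1); w1m[3,4] -= h   # ... and down
fd = (ann(w1p, w2, w3, x1) - ann(w1m, w2, w3, x1)) / (2h)
isapprox(g[3,4], fd; atol=1e-6)  # should be true if the gradient is correct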
Another unique feature of XDiff.jl is that it can generate expressions not only for functions R^n → R, but also for functions R^n → R^m, using Einstein indexing notation:
ctx = Dict(:outfmt => :ein)
dexs = rdiff(:(z = W*x + b); ctx=ctx, W=rand(3,4), x=rand(4), b=rand(3))
dexs[:W] # ==> :(dz_dW[i,m,n] = x[n] * (i == m))
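In index form this reads ∂z[i]/∂W[m,n] = x[n] when i == m and 0 otherwise: perturbing W[m,n] only moves z[m], at rate x[n]. Here is a quick check of that, a sketch of my own rather than XDiff.jl output, which materializes the tensor from the emitted expression and verifies one slice numerically:
# materialize dz_dW[i,m,n] = x[n] * (i == m) and check it by perturbation
# (my own sketch; W, x, b match the rdiff call above)
W, x, b = rand(3,4), rand(4), rand(3)
dz_dW = zeros(3, 3, 4)
for i in 1:3, m in 1:3, n in 1:4
    dz_dW[i, m, n] = x[n] * (i == m)
end
h = 1e-6
Wp = copy(W); Wp[2,3] += h                 # perturb W[2,3]
num = ((Wp*x + b) - (W*x + b)) / h         # numeric dz/dW[2,3], a length-3 vector
isapprox(num, dz_dW[:, 2, 3]; atol=1e-6)   # should be true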
Read the full README for more details, or install XDiff.jl directly using:
Pkg.clone("https://github.com/dfdx/XDiff.jl")