Autodiff: Custom derivatives of vector functions for linear algebra

Flux and other autodiff packages most definitely let you define gradients of vector-valued functions. You wouldn't get very far without them; many of the functions you want to differentiate are vector- or higher-order-tensor-valued.

Take a look at the examples in https://github.com/FluxML/Flux.jl/blob/master/docs/src/internals/tracker.md#custom-gradients. The minus and multiply functions whose gradients are defined there return general tensors/arrays. In particular, the Δ in those definitions is not restricted to being a scalar.
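
For reference, here is a minimal sketch adapted from that page, assuming the older Tracker-based Flux API it documents (where `param`, `track`, `@grad`, and `back!` live in `Flux.Tracker`). The point is that the pullback's Δ is an array with the same shape as the output, not a scalar:

```julia
using Flux
using Flux.Tracker: TrackedArray, track, @grad, data

# Plain elementwise subtraction; the output is an array, not a scalar.
minus(a, b) = a - b

# Intercept tracked arrays so Tracker records this call on the tape.
minus(a::TrackedArray, b::TrackedArray) = track(minus, a, b)

# Custom gradient: the pullback receives Δ, the sensitivity of the
# output, which is itself an array, and returns one gradient per input.
@grad function minus(a, b)
  return minus(data(a), data(b)), Δ -> (Δ, -Δ)
end

# Usage: seed the backward pass with an array-valued Δ.
a, b = Flux.param(rand(3)), Flux.param(rand(3))
c = minus(a, b)
Tracker.back!(c, ones(3))   # Δ is a vector here, not a scalar
Tracker.grad(a)             # == ones(3)
```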
