I’m trying to find an efficient way to compute the product of a transposed vector with a Jacobian matrix. Given a vector function f and two large vectors v and x of the same length, I want to compute v^T * jacobian(f)(x) without explicitly building the Jacobian matrix. Does anyone know how this can be done? Apparently it can be done in a way similar to the Jacobian-vector product jacobian(f)(x)*v, but I’m unable to make the connection.
PS: for the Jacobian-vector product, the trick is shown in the paper “Fast Exact Multiplication by the Hessian”, which uses the identity jacobian(f)(x)*v = d/dr f(x + r*v)|_{r=0}. Below is my simple implementation of it, but as mentioned, I need the other direction, v^T * jacobian(f)(x).
```julia
using ForwardDiff

# Compute jacobian(f)(x)*v without building the Jacobian:
# differentiate r -> f(x + r*v) at r = 0.
function jvp(f::Function, x::AbstractVector, v::AbstractVector)
    g = r -> f(x + r * v)
    return ForwardDiff.derivative(g, 0.0)
end

# I actually want v'*jacobian(f)(x)
vjp(f::Function, x::AbstractVector, v::AbstractVector) = ??
```
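For context, the one fallback I’m aware of comes from the chain rule: v^T * jacobian(f)(x) is the (transposed) gradient of the scalar function x ↦ v ⋅ f(x), since ∇_x (v ⋅ f(x)) = jacobian(f)(x)^T * v. A minimal sketch of that identity, together with a sanity check of jvp against an explicitly built Jacobian (the name vjp_gradient and the example f are mine; note that with ForwardDiff this gradient still costs roughly a full forward-mode Jacobian, so it doesn’t give the cheap single-pass trick I’m after):

```julia
using ForwardDiff, LinearAlgebra

# Forward-mode JVP trick: differentiate r -> f(x + r*v) at r = 0.
function jvp(f::Function, x::AbstractVector, v::AbstractVector)
    g = r -> f(x + r * v)
    return ForwardDiff.derivative(g, 0.0)
end

# Chain-rule fallback: v'*J_f(x) equals the transposed gradient of
# the scalar x -> dot(v, f(x)).
vjp_gradient(f::Function, x::AbstractVector, v::AbstractVector) =
    ForwardDiff.gradient(z -> dot(v, f(z)), x)

# Sanity check against the explicit Jacobian (toy example, mine)
f(x) = [x[1]^2 + x[2], x[2]^3]
x = [1.0, 2.0]
v = [3.0, 4.0]
J = ForwardDiff.jacobian(f, x)

@assert jvp(f, x, v) ≈ J * v           # Jacobian-vector product
@assert vjp_gradient(f, x, v) ≈ J' * v  # vector-Jacobian product, as a column
```

A true reverse-mode tool would make the vjp_gradient gradient cheap (one backward pass), which is why I suspect the efficient answer involves reverse mode rather than another ForwardDiff incantation.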