Registering jacobians of composite types in AutoGrad.jl

I am posting this question here because AutoGrad.jl is associated with Knet.

I have implemented a derivative of svd.
svd returns an object of type SVD with components U, S, and Vt. I want to register this derivative with AutoGrad so that a scalar function of any or all of these components can be differentiated with, for instance, @diff. Currently I have not wrapped the derivatives in a composite type; instead I return a Tuple of tensors. I have successfully registered the derivative with

@primitive svd(x) dsvd(x)

but I can’t test it with @diff because, as far as I can tell, @diff only handles scalar-valued functions. I must be missing something, though, because the following does differentiate:

using AutoGrad

h(x) = (x, x.^2, x.^3)                 # returns a Tuple of arrays
hh(x) = (a = h(x); sum(a[1] + a[3]))   # scalar function of two components
X = Param([3.0])
y = @diff hh(X)                        # T(30.0)
grad(y, X)                             # 28.0, i.e. 1 + 3x^2 at x = 3

Clearly, AutoGrad propagated the gradient through the components of the Tuple.
Any hints on how I should declare dsvd so that it can be used to differentiate functions of svd?
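
For reference, here is a minimal sketch of how a primitive with a single array output could be registered and tested. It covers only the singular values; svdvals, the svdvals_grad helper, and the U*Diagonal(dy)*Vt pullback are illustrative stand-ins for this post, not my actual dsvd:

using AutoGrad
import LinearAlgebra: svdvals, svd, Diagonal

# First-order pullback for the singular values: if y = svdvals(x) and dy is
# the upstream gradient, then dx = U * Diagonal(dy) * Vt.
function svdvals_grad(dy, x)
    F = svd(value(x))            # value(x) unboxes x in case AutoGrad has boxed it
    return F.U * Diagonal(dy) * F.Vt
end

@primitive svdvals(x),dy,y svdvals_grad(dy, x)

A = Param(randn(3, 3))
f(a) = sum(svdvals(a))           # scalar function of the singular values
y = @diff f(A)
grad(y, A)                       # gradient of the sum of singular values, i.e. U*Vt

The open question is how to do the same when the primitive returns the full SVD object (or a Tuple of U, S, and Vt) instead of a single array.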


AutoGrad used to support svd, but that functionality was lost in the revamp for Julia 1.0.
I just opened an issue for your problem: https://github.com/denizyuret/Knet.jl/issues/403
Unfortunately, I don’t know how to handle composite types.


I wasn’t aware that svd and qr had been supported. It would be great to re-enable them in AutoGrad. Thanks for opening the issue.