I am posting this question here because AutoGrad.jl is associated with Knet. I have a derivative of the `svd` function. `svd` returns an object of type `SVD`, with components `U`, `S`, and `V`. I want to register the derivative with AutoGrad such that a scalar function of any or all of these components can be differentiated with, for instance, `@diff`.
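For reference, this is the structure I am starting from (standard `LinearAlgebra` behavior; the sizes in the comments assume a 4×3 input):

```julia
using LinearAlgebra

F = svd(randn(4, 3))   # thin SVD by default
F.U   # 4×3 matrix of left singular vectors
F.S   # length-3 vector of singular values
F.V   # 3×3 matrix of right singular vectors
```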
@diff. Currently, I have not wrapped the derivatives in a composite type, but return a Tuple of tensors. I have successfully registered the derivative with
@primitive svd(x) dsvd(x)
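In case the exact form matters, the fuller registration I have been experimenting with looks roughly like this. `mysvd` and `dsvd_back` are names of my own, and the backward function is only a stub here; the `,dy,y` suffix is how `@primitive` exposes the output gradient and the primal output to the gradient expression:

```julia
using AutoGrad, LinearAlgebra

# Wrapper so the primitive returns a plain Tuple of tensors
# rather than the SVD composite type.
mysvd(x) = (F = svd(x); (F.U, F.S, F.V))

# Stub for the backward pass: given the input x, the primal output
# y = (U, S, V), and the output cotangents dy (one per component),
# it should return the gradient with respect to x.
dsvd_back(x, y, dy) = error("SVD adjoint goes here")

# Gradient with respect to the first (and only) argument of mysvd.
@primitive mysvd(x),dy,y dsvd_back(x, y, dy)
```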
However, I can’t test it with `@diff`, because as far as I can tell it only handles scalar-valued functions. I must be missing something, though, because the following does differentiate:

```julia
h(x)  = (x, x.^2, x.^3)
hh(x) = (a = h(x); sum(a[1] + a[3]))
X = Param([3.0])
y = @diff hh(X)   # T(30.0)
grad(y, X)        # 28.0
```
So AutoGrad evidently computed the Jacobian through the Tuple.
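To make the goal concrete, the kind of scalar function I ultimately want to differentiate looks like this (using the `mysvd` wrapper from above; the particular loss is just an illustration):

```julia
# A scalar function of the singular values only.
loss(x) = ((U, S, V) = mysvd(x); sum(abs2, S))

X = Param(randn(4, 3))
y = @diff loss(X)   # this is what I would like to work
grad(y, X)
```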
Any hints on how I should declare `dsvd` so that it can be used to differentiate functions of the SVD components?