Hello,
Given a matrix A \in \mathbb{R}^{m\times n}, my goal is to recover the SVD of A from the eigendecompositions of A^\top A and AA^\top.
A compact SVD of a matrix A \in \mathbb{R}^{m\times n} gives:
A = U_{svd}\Sigma V_{svd}^\top, in which \Sigma is an r \times r diagonal matrix with r = \min\{m, n\}. U_{svd} \in \mathbb{R}^{m \times r} and V_{svd} \in \mathbb{R}^{n \times r} satisfy U_{svd}^\top U_{svd} = V_{svd}^\top V_{svd} = I_r.
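For instance, these properties can be checked numerically with Julia's LinearAlgebra standard library (the dimensions below are just an arbitrary example):

```julia
using LinearAlgebra

m, n = 4, 6
r = min(m, n)                       # here r = 4
A = randn(m, n)
F = svd(A)                          # compact SVD: F.U is m×r, F.S has r entries, F.V is n×r
@assert F.U' * F.U ≈ I(r)           # orthonormal columns of U
@assert F.V' * F.V ≈ I(r)           # orthonormal columns of V
@assert F.U * Diagonal(F.S) * F.V' ≈ A
```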
A is not directly accessible; I only know C_x = A^\top A \in \mathbb{R}^{n \times n} and
C_y = AA^\top \in \mathbb{R}^{m \times m}.
From the spectral theorem, we know that:
C_x = V \Lambda_x V^\top, C_y = U \Lambda_y U^\top.
If we assume that m \leq n, so that r = m, the eigenvalues \Lambda_y equal the first r values of \Lambda_x (both sorted in decreasing order), and these are the squares of the singular values in \Sigma.
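This relation between the two spectra is easy to verify numerically (arbitrary small dimensions with m ≤ n):

```julia
using LinearAlgebra

m, n = 3, 5                                        # m ≤ n, so r = m
A  = randn(m, n)
λx = sort(eigvals(Symmetric(A' * A)); rev = true)  # n eigenvalues of C_x
λy = sort(eigvals(Symmetric(A * A')); rev = true)  # m eigenvalues of C_y
s  = svdvals(A)                                    # r singular values, decreasing
@assert λy ≈ λx[1:m]                               # the top r eigenvalues coincide
@assert λy ≈ s .^ 2                                # and equal the squared singular values
```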
However, there is a sign ambiguity in the columns of the eigenvector matrices of C_x and C_y.
Therefore, U\sqrt{\Lambda_y}V[:,1:r]^\top does not give the SVD of A! The signs of the columns of U and V are not chosen such that AV[:,1:r] = U\Sigma and A^\top U = V[:,1:r]\Sigma, as in the SVD (Singular value decomposition - Wikipedia).
Is there a way to flip the sign of the columns of U and V to recover the SVD decomposition of A?
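One workaround, if matrix-vector products with A are available (note: this is an extra assumption beyond knowing only C_x and C_y): take U and \Sigma from the eigendecomposition of C_y, then define V through the SVD relation A^\top U = V\Sigma, which fixes the signs automatically. A minimal sketch:

```julia
using LinearAlgebra

# Sketch under the assumption that products with A can be formed;
# from C_x and C_y alone the signs are not identifiable (A and -A
# give the same C_x and C_y).
m, n = 4, 7                                      # m ≤ n, so r = m
A  = randn(m, n)
Ey = eigen(Symmetric(A * A'); sortby = λ -> -λ)  # C_y = A A'
U  = Ey.vectors                                  # m × m
σ  = sqrt.(max.(Ey.values, 0))                   # singular values
V  = (A' * U) ./ σ'                              # n × m; signs inherited from U via A
@assert U * Diagonal(σ) * V' ≈ A                 # SVD of A recovered
```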
A simple example (in the output below, a column-difference norm of 2 means the two columns are opposite to each other, while a norm near 0 means they already agree):
using LinearAlgebra

m = 10
n = 20
r = min(m, n)
A = randn(m, n)
Cx = A'*A   # n × n
Cy = A*A'   # m × m
Ex = eigen(Cx; sortby = λ -> -λ)
Ey = eigen(Cy; sortby = λ -> -λ)
svdA = svd(A)
@show norm(svdA.S - sqrt.(Ex.values[1:r]))
@show norm(svdA.S - sqrt.(Ey.values[1:r]))
@show norm.(eachcol(Ex.vectors[:,1:r]-svdA.V))
@show norm.(eachcol(Ey.vectors-svdA.U));