# Absolute value of a matrix

Let A be a symmetric (or Hermitian) matrix. I want to compute abs(A), by which I mean taking the absolute value of its eigenvalues. See for instance linear algebra - Understanding the absolute value of a matrix - Mathematics Stack Exchange.

I was surprised that Base.abs(A) does not work. I implemented a few alternatives here:

using LinearAlgebra

absmat1(A::AbstractMatrix) = sqrt(A'A)
absmat2(A::AbstractMatrix) = sqrt(Hermitian(A'A))
function absmat3(A::AbstractMatrix)
    F = eigen(A)
    return F.vectors * Diagonal(abs.(F.values)) * F.vectors'
end
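As a quick sanity check (a self-contained sketch that restates the definitions above), all three implementations should agree on a symmetric matrix, and the result should be positive semidefinite:

```julia
using LinearAlgebra

absmat1(A::AbstractMatrix) = sqrt(A'A)
absmat2(A::AbstractMatrix) = sqrt(Hermitian(A'A))
function absmat3(A::AbstractMatrix)
    F = eigen(A)
    return F.vectors * Diagonal(abs.(F.values)) * F.vectors'
end

_M = randn(5, 5)
A = _M + _M'                       # small symmetric test matrix

B1, B2, B3 = absmat1(A), absmat2(A), absmat3(A)

@assert B1 ≈ B2                    # all three agree up to roundoff
@assert B2 ≈ B3
# the eigenvalues of abs(A) are ≥ 0 (up to roundoff)
@assert all(λ -> λ ≥ -1e-8, eigvals(Hermitian(B3)))
```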


Some benchmarks (in the style of Fast diag(A' * B * A) - #6 by Mason):

using BenchmarkTools

let n = 1000
    _A = randn(n, n)
    A = _A + _A'
    B = Hermitian(A)
    for f ∈ (absmat1, absmat2, absmat3)
        print(f, "   ")
        @btime $f($A)
        print(f, " (sym)    ")
        @btime $f($B)
    end
end;
# absmat1     133.304 ms (21 allocations: 46.14 MiB)
# absmat1 (sym)      139.904 ms (23 allocations: 53.77 MiB)
# absmat2     132.386 ms (21 allocations: 46.14 MiB)
# absmat2 (sym)      139.972 ms (23 allocations: 53.77 MiB)
# absmat3     147.802 ms (20 allocations: 38.51 MiB)
# absmat3 (sym)      147.689 ms (18 allocations: 38.51 MiB)


Is there a faster way?

I would also suggest abs(A) be added to Base.

I would look into SVD functions in LinearAlgebra

“The singular values are the absolute values of the eigenvalues of a normal matrix A, because the spectral theorem can be applied to obtain unitary diagonalization of A as A = U\Lambda U^*. Therefore, \sqrt{A^* A} = U|\Lambda|U^*.”
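For a symmetric (hence normal) matrix, this identity is easy to check numerically; a small sketch:

```julia
using LinearAlgebra

_M = randn(4, 4)
A = _M + _M'        # real symmetric, hence normal

# Singular values (returned in descending order) equal the
# absolute values of the eigenvalues, sorted the same way.
@assert svdvals(A) ≈ sort(abs.(eigvals(A)), rev=true)
```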

The first two methods would not be stable in general if the matrix has singular values of widely varying magnitudes, since forming A^* A squares the condition number. You narrowed the focus to Hermitian matrices, but abs(A) is defined more generally: absmat3 is numerically fine for Hermitian matrices, but it is wrong for general square matrices.

More generally, \sqrt{A^* A} is the semidefinite factor H in the polar factorization A=UH where U is unitary and can be obtained from the SVD:

function absmat4(A::AbstractMatrix)
    F = svd(A)
    return F.Vt' * Diagonal(F.S) * F.Vt
end
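As a quick check (a sketch restating absmat4 so it is self-contained): on a symmetric matrix it agrees with the eigenvalue-based construction, and on a general square matrix it returns the Hermitian polar factor H with A = UH for unitary U:

```julia
using LinearAlgebra

function absmat4(A::AbstractMatrix)
    F = svd(A)
    return F.Vt' * Diagonal(F.S) * F.Vt
end

# Symmetric case: agrees with the eigendecomposition-based abs.
_M = randn(6, 6)
S = _M + _M'
F = eigen(S)
@assert absmat4(S) ≈ F.vectors * Diagonal(abs.(F.values)) * F.vectors'

# General square case: A = U * H with U unitary.
G = randn(6, 6)
H = absmat4(G)
U = G / H                       # recover the unitary factor
@assert opnorm(U'U - I) < 1e-8  # U is (numerically) unitary
@assert U * H ≈ G
```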


That is not faster, but it works for any square matrix. You could then have

function absmat3(A::Hermitian)
    F = eigen(A)
    return F.vectors * Diagonal(abs.(F.values)) * F.vectors'
end

(Note the signature: Hermitian's first type parameter is the element type, so Hermitian{<:AbstractMatrix} would never match; dispatching on plain Hermitian does.)


I don’t think you will do much better with methods based on the standard matrix decompositions. But there are iterative methods for U, which would then give you H = U^* A. A Newton iteration for computing U takes the form

X_{k+1} = \frac{1}{2} (X_k + X_k^{-*}), \qquad X_0 = A.

X_k is quadratically convergent to U. There also appears to be a package for this already: PolarFact.jl. The README provides some references.
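A minimal sketch of that Newton iteration (not PolarFact.jl's implementation; the tolerance, iteration cap, and the name absmat_newton are arbitrary choices here), assuming A is square and nonsingular:

```julia
using LinearAlgebra

# Newton iteration for the unitary polar factor U of a nonsingular A:
#   X_{k+1} = (X_k + X_k^{-*}) / 2,   X_0 = A
# then the Hermitian factor is H = U^* A.
function absmat_newton(A::AbstractMatrix; tol=1e-12, maxiter=100)
    X = float.(A)
    for _ in 1:maxiter
        Xnew = (X + inv(X)') / 2
        converged = norm(Xnew - X) ≤ tol * norm(Xnew)
        X = Xnew
        converged && break
    end
    H = X' * A                # X ≈ U at convergence
    return (H + H') / 2       # symmetrize to clean up roundoff
end

A = randn(5, 5)
F = svd(A)
@assert absmat_newton(A) ≈ F.Vt' * Diagonal(F.S) * F.Vt
```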


Great answer. A (non-iterative) polar factorization is also available in MatrixFactorizations.jl (GitHub - JuliaLinearAlgebra/MatrixFactorizations.jl: A Julia package to contain non-standard matrix factorizations).
