The title essentially says it all: if I create a Hermitian matrix A with
julia> using LinearAlgebra
julia> R = randn(10, 10)
julia> A = R * R' # definitely Hermitian and positive semidefinite
and another arbitrary matrix B
julia> B = rand(7, 10)
then
julia> ishermitian(B * A * B')
false
even though, mathematically, it should be Hermitian. I suspect the problem is simply floating-point roundoff. Is there a fast method that guarantees that the result is Hermitian?
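For what it's worth, one quick way to confirm that the deviation is only roundoff is to measure the asymmetry directly (this is just a diagnostic, not a fix):
julia> M = B * A * B';
julia> maximum(abs, M - M') # roundoff-sized compared to maximum(abs, M)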
Since B * A * B' is parsed as a single call to the 3-argument *, it's possible that this could propagate if A::Hermitian. It needs to check that both outer factors are the same B, though, so it can't be type-stable, and it's probably not a great idea:
julia> A = Hermitian(R * R');
julia> @less B * A * B' # 3-matrix method, calls _tri_matmul(A,B,C) ... on Julia >= 1.7?
julia> Base.:*(A::AbstractMatrix, B::Hermitian, C::Adjoint{<:Any, <:AbstractMatrix}) = A === parent(C) ? Hermitian(A * (B * C)) : LinearAlgebra._tri_matmul(A, B, C) # type piracy: wraps in Hermitian only when the two outer matrices are literally the same array
julia> B * A * B'
7×7 Hermitian{Float64, Matrix{Float64}}:
15.6641 14.2528 22.1946 8.88175 19.089 12.2722 14.4311
14.2528 29.7732 33.362 14.4554 22.016 19.0204 19.188
...
julia> @code_warntype B * A * B'
...
Body::Union{Hermitian{Float64, Matrix{Float64}}, Matrix{Float64}}
I’m sure this has been brought up in the past, but shouldn’t ishermitian have optional tolerance cutoffs?
Internally in code you might want to check if a matrix is approximately Hermitian and if so wrap it in Hermitian to call specialized algorithms. Though I guess you can do it yourself with isapprox(M, M'; ...).
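For example, a small helper along these lines (maybe_hermitian is just a hypothetical name) could do the check-and-wrap in one place:
julia> maybe_hermitian(M; kwargs...) = isapprox(M, M'; kwargs...) ? Hermitian(M) : M
julia> maybe_hermitian(B * A * B') # returns a Hermitian wrapper here, since the asymmetry is only roundoff-level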
I suppose just wrapping it in Hermitian(B * A * B') is easiest for now. If I get around to it (and need the extra performance), it might be advantageous to write a dedicated function for this tri-product, hermitiansandwich(A, B) = ..., that has hermiticity baked in (see the sketch below).
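As a sketch of what that might look like (assuming plain wrapping is enough; a version that actually saves flops would need to compute only one triangle of the product, which this does not do):
julia> hermitiansandwich(A, B) = Hermitian(B * A * B') # Hermitian by construction: downstream code only reads the upper triangle of the wrapped product
julia> ishermitian(hermitiansandwich(A, B)) # true by construction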
In most cases, in your own code you should know whether a matrix is supposed to be Hermitian in exact arithmetic. If users are passing you arbitrary matrices, it’s reasonable to ask them to indicate this by using the Hermitian type if possible.
The challenge with using isapprox(A, A', rtol=???) to decide whether you can use some Hermitian-specialized algorithm is that you then need to do some careful error analysis to decide what tolerance to use.
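For instance, a simple heuristic is to scale the relative tolerance with the matrix dimension and the machine epsilon of the element type, though this is exactly the kind of choice that is hard to justify without analyzing how M was computed (approx_hermitian is a hypothetical helper name):
julia> approx_hermitian(M) = isapprox(M, M'; rtol = size(M, 1) * eps(real(float(eltype(M))))) # heuristic tolerance, not an error bound
julia> approx_hermitian(B * A * B')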
Yes, I suppose that’s true. We have use cases in tensor networks where we reduce a complicated set of operations (for example, a series of tensor contractions, eigendecompositions, etc.) to a matrix which is supposed to be Hermitian but may not be in practice due to numerical errors or coding errors. So it generalizes B * A * B' to cases where it is much less obvious that the resulting matrix ends up Hermitian. But again, these cases can be checked manually with isapprox and don’t need to be handled inside ishermitian.
I wrote a package a while back to try to treat these questions in a systematic way: IsApprox.jl
julia> using IsApprox
julia> R = randn(10,10); A = R * R';
julia> B = rand(7, 10);
julia> M = B * A * B';
julia> IsApprox.ishermitian(M)
false
julia> IsApprox.ishermitian(M, Equal())
false
julia> IsApprox.ishermitian(M, Approx())
true