You can just use `ishermitian(A)`. And after that I would set `H = Hermitian(A)` and use `H` instead of `A` (so that Julia will know to use more efficient algorithms).
And normally you would not use `@assert` for this: you should assume that `@assert` is something that might be removed in production code, and it should only be used to check for bugs. Throw an exception instead, e.g.

```julia
ishermitian(A) || throw(ArgumentError("matrix must be Hermitian"))
```
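A minimal sketch of how these two suggestions fit together (the function name `as_hermitian` is my own, not a standard API):

```julia
using LinearAlgebra

# Hypothetical helper: validate with an exception (not @assert),
# then wrap in Hermitian so specialized methods are dispatched.
function as_hermitian(A::AbstractMatrix)
    ishermitian(A) || throw(ArgumentError("matrix must be Hermitian"))
    return Hermitian(A)
end

H = as_hermitian([2.0 1.0; 1.0 3.0])
```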
It is more idiomatic to simply do `if isposdef(A)` (or `if isposdef(H)` if you use the `Hermitian` wrapper).
This allocates two arrays (`[x, y]` is one array, and `[x, y] .< 0` is another array), just to check whether two numbers are negative. Putting everything into an array, even if it is just two scalars, is kind of a Matlab anti-pattern. Just check `x < 0 && y < 0`.
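For concreteness, here is the difference spelled out (my own micro-example; `check_array` and `check_scalar` are hypothetical names):

```julia
# Array version: builds [x, y], then a second Bool array [x, y] .< 0
check_array(x, y) = all([x, y] .< 0)

# Scalar version: no temporary arrays at all
check_scalar(x, y) = x < 0 && y < 0

check_array(-1.0, -2.0) == check_scalar(-1.0, -2.0)  # both return true here
```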
I would do the opposite: if you want to check whether a matrix is negative definite, it's probably more efficient to call `isposdef(-A)` (or `isposdef(-H)`) than to look at eigenvalues. The reason is that positive-definiteness can be checked using only a Cholesky factorization, which is much cheaper than computing eigenvalues.

For example, with a 1000×1000 negative-definite matrix:
```julia
julia> using LinearAlgebra, BenchmarkTools

julia> A = randn(1000,1000); A = -A'A; H = Hermitian(A);

julia> @btime isposdef($H) || isposdef(-$H);
  5.998 ms (6 allocations: 22.89 MiB)

julia> @btime eigvals($H);
  42.406 ms (11 allocations: 7.99 MiB)

julia> @btime (eigmin($H), eigmax($H));
  67.561 ms (22 allocations: 15.96 MiB)

julia> @btime eigmin($H);
  31.952 ms (11 allocations: 7.98 MiB)
```
Notice that checking both `isposdef(H)` and `isposdef(-H)` is about 7× faster than computing `eigvals(H)`.
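Putting this together, one could sketch a definiteness check that never computes an eigenvalue (the `definiteness` helper is my own illustration, not a library function; note that `isposdef` tests *strict* definiteness, so the fallback case lumps semidefinite and indefinite matrices together):

```julia
using LinearAlgebra

# Classify definiteness via Cholesky-based checks only (no eigenvalues).
function definiteness(H::Hermitian)
    isposdef(H) && return :positive_definite
    isposdef(-H) && return :negative_definite
    return :indefinite_or_semidefinite  # can't distinguish without more work
end

A = randn(100, 100)
definiteness(Hermitian(A'A + I))   # → :positive_definite
definiteness(Hermitian(-A'A - I))  # → :negative_definite
```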
No (at least not for a Hermitian matrix). Notice that in my example above, `eigmin(H)` is about 25% faster than `eigvals(H)`. However, it is still slower to call both `eigmin` and `eigmax`, since they don't share any computations.
In principle, it should be possible to compute both `eigmin` and `eigmax` more efficiently than computing all of the eigenvalues, but you need to share the Hessenberg/tridiagonal reduction (since that is the first step of any eigenvalue calculation, and a big part of the cost):

```julia
function eigminmax(H::Hermitian)
    T = hessenberg(H).H  # real SymTridiagonal reduction
    return eigmin(T), eigmax(T)
end
```
which gives

```julia
julia> @btime eigminmax($H);
  32.574 ms (65 allocations: 8.27 MiB)
```
which is almost twice as fast as calling `(eigmin(H), eigmax(H))`, about the same speed as `eigmin(H)`, and 25% faster than `eigvals(H)`. However, it is still much slower than `isposdef(H)` (which computes no eigenvalues at all).
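As a quick sanity check (my own, reusing the `eigminmax` definition from above), the shared-reduction version should agree with separate `eigmin`/`eigmax` calls up to roundoff:

```julia
using LinearAlgebra

function eigminmax(H::Hermitian)
    T = hessenberg(H).H  # real SymTridiagonal reduction
    return eigmin(T), eigmax(T)
end

A = randn(50, 50)
H = Hermitian(A + A')
lo, hi = eigminmax(H)
lo ≈ eigmin(H) && hi ≈ eigmax(H)  # → true
```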