Is it possible to do any better? Right now I’m only interested in matrices of size at most 6x6, but it would be nice to have a version of the function that is also optimized for larger sizes.
Finding the largest eigenvalue/eigenvector (in absolute value) using the power method is fast. If this eigenvalue, t, is positive, subtract t*I from the matrix and find the largest eigenvalue (in absolute value) again with the power method. If the matrix was positive definite, the new one should be smaller than the previous t.
The logic of this is pretty simple, since the eigenvalues are real. As noted, numerical issues make this determination delicate unless there is a good gap between 0 and the minimal eigenvalue.
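A rough sketch of that check (the helper names, iteration limit, and tolerances are arbitrary choices, not anything canonical):

```julia
using LinearAlgebra

# Illustrative power iteration for the eigenvalue of largest magnitude of a
# symmetric matrix. Converges slowly if the two largest magnitudes are close.
function dominant_eigenvalue(A; iters = 1000, tol = 1e-12)
    x = normalize!(randn(size(A, 1)))
    λ = 0.0
    for _ in 1:iters
        y = A * x
        λnew = dot(x, y)                          # Rayleigh quotient estimate
        abs(λnew - λ) <= tol * abs(λnew) && return λnew
        λ = λnew
        x = normalize!(y)
    end
    return λ
end

function probably_positive_definite(A; tol = 1e-10)
    t = dominant_eigenvalue(A)
    t <= 0 && return false                        # largest-magnitude eigenvalue is ≤ 0
    # Eigenvalues of A - t*I lie in [λ_min - t, 0], so the shifted dominant
    # eigenvalue recovers λ_min = t - |shifted value|.
    λmin = t - abs(dominant_eigenvalue(A - t * I))
    return λmin > tol * t                         # positive by some relative margin
end
```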
Just to set expectations: in the presence of numerical errors, the most you can expect to do is determine whether A is close to a matrix that is positive semidefinite. Even if you did have an exactly positive semidefinite matrix, as soon as you do any computations, at best you get results with a small backward error, so you really get results for a matrix near A and are quite likely to see small negative eigenvalues. In practice you might want to make sure there are no negative eigenvalues of magnitude much larger than the unit roundoff (or much larger than whatever errors might already have been in your matrix).
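Concretely, that kind of check might look like this (the name and the factor of 10 in the tolerance are arbitrary):

```julia
using LinearAlgebra

# Treat A as numerically positive semidefinite if no eigenvalue is more
# negative than a modest multiple of the unit roundoff times the norm of A.
isnearlypsd(A; tol = 10 * eps(eltype(A)) * opnorm(A)) =
    eigmin(Symmetric(A)) >= -tol
```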
Pivoted Cholesky can be used to compute a factorization of a semidefinite matrix, and if that works, you know that the matrix was close to being semidefinite. But it’s a little tricky to use: it bails out at the first negative pivot and isn’t stable on indefinite matrices. I don’t think you can conclude much from the last pivot computed if it’s possible that your matrix was really indefinite, although if the factorization closely approximates A, you certainly know that A is close to semidefinite.
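A sketch of that kind of check using Julia’s pivoted Cholesky (cholesky with RowMaximum(); the function name and tolerance here are arbitrary):

```julia
using LinearAlgebra

# Run pivoted Cholesky without throwing on failure, then check how well the
# truncated factorization reproduces A. On older Julia versions the pivoting
# argument was Val(true) instead of RowMaximum().
function close_to_psd_by_pivoted_cholesky(A; rtol = 1e-10)
    F = cholesky(Symmetric(A), RowMaximum(); check = false)
    U = F.U[1:F.rank, :]                 # keep only the pivots actually computed
    ip = invperm(F.p)
    Arec = (U' * U)[ip, ip]              # undo the pivoting permutation
    return opnorm(Symmetric(A) - Arec) <= rtol * opnorm(A)
end
```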
I think it’s simpler to interpret the results if you get all pivots in a way that is stable and reasonably rank-revealing on general Hermitian matrices, so that you have complete information about the inertia of your matrix. A fairly reliable and fast way to do that is LDL^T with rook pivoting. The bunchkaufman function has an option for rook pivoting.
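For example, something along these lines (the matrix here is just constructed for illustration, with one tiny negative eigenvalue, so the exact output will vary from run to run):

```julia
using LinearAlgebra

# Build a symmetric 6x6 matrix that is indefinite, but only barely: one
# eigenvalue is a tiny negative number, the rest are nonnegative.
Q = Matrix(qr(randn(6, 6)).Q)            # random orthogonal matrix
A = Symmetric(Q * Diagonal([2.0, 1.0, 0.5, 0.1, 1e-14, -1e-12]) * Q')

# Bunch-Kaufman LDL^T with rook pivoting: pass true as the second argument.
F = bunchkaufman(A, true)

# D is block diagonal (1x1 and possibly 2x2 blocks); its inertia matches the
# inertia of A, so tiny or negative pivots show up here.
F.D
```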
So in a case like this you get a factorization of a matrix that is indefinite, but just barely, and that shows up in the block diagonal of D.
Rook pivoting has a decent degree of reliability in revealing rank and is robust enough that A is close to semidefinite if and only if all nonnegligible diagonal elements of D are positive, where it’s up to you to define “negligible”. If A is positive definite by some larger margin, you won’t get any negligible elements. It’s also possible to get 2x2 blocks on the diagonal of D; the inertia includes the signs of the eigenvalues of those blocks. So there is some potential inconvenience in handling 2x2 blocks, but this would still be significantly faster than computing eigenvalues for larger matrices. I don’t know about something as small as 6x6, however.
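If you want to read the inertia off of D programmatically, 2x2 blocks and all, here is a rough sketch (the function name and the default “negligible” threshold are just placeholders):

```julia
using LinearAlgebra

# Count (positive, negative, negligible) eigenvalues of the block-diagonal
# factor D from bunchkaufman (a Tridiagonal for real symmetric input).
# A 2x2 block is signaled by a nonzero subdiagonal entry.
function inertia_of_D(D; negligible = 1e-12)
    n = size(D, 1)
    npos = nneg = nzero = 0
    k = 1
    while k <= n
        if k < n && D[k+1, k] != 0          # 2x2 block
            for λ in eigvals(Symmetric(Matrix(D[k:k+1, k:k+1])))
                if abs(λ) <= negligible
                    nzero += 1
                elseif λ > 0
                    npos += 1
                else
                    nneg += 1
                end
            end
            k += 2
        else                                # 1x1 block
            d = D[k, k]
            if abs(d) <= negligible
                nzero += 1
            elseif d > 0
                npos += 1
            else
                nneg += 1
            end
            k += 1
        end
    end
    return (npos = npos, nneg = nneg, nzero = nzero)
end
```

Then something like `inertia_of_D(bunchkaufman(Symmetric(A), true).D)` gives the counts, and “close to positive semidefinite” corresponds to getting no nonnegligible negative pivots.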