Hello, I was curious why my Julia code spits out negative variances for some sandwich-form vcov matrices (fixed effects, clustering, etc.).
Let's say X is my matrix of control variables.
I found that all of the eigenvalues of (X'X)^(-1) are positive, but isposdef(inv(X' * X)) still returns false.
How is this possible? As far as I know, for a symmetric matrix, having all positive eigenvalues is equivalent to positive definiteness.
Can someone explain what's going on here?

P.S. I found out that in Julia the inverse of a symmetric matrix is not always symmetric. Is there a typical way to fix this?

The problem is that isposdef also requires the matrix to be Hermitian. Your matrix X' * X is indeed Hermitian (real-symmetric), but because of roundoff errors its computed inverse is generally not exactly symmetric.
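For example, here is a minimal sketch of the effect; the matrix X is just random data standing in for your controls:

```julia
using LinearAlgebra, Random

Random.seed!(1234)        # hypothetical data, for illustration only
X = randn(100, 4)
A = X' * X                # exactly symmetric in exact arithmetic
Ainv = inv(A)             # computed inverse: symmetric only up to roundoff

norm(Ainv - Ainv')        # tiny, but typically nonzero
isposdef(Ainv)            # false whenever Ainv is not *exactly* symmetric
isposdef(Hermitian(Ainv)) # true: symmetrize first, then check definiteness
```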

You can fix this by telling Julia that the matrix is Hermitian, so that this property is preserved through matrix inversion and other operations, simply by wrapping it in the Hermitian type:

julia> H = Hermitian(X' * X);

julia> ishermitian(H^-1)
true

julia> isposdef(H^-1)
true

PS. The generic advice in numerical computation is to avoid computing matrix inverses explicitly and to use matrix factorizations instead: normally the explicit inverse is not needed, and computing it is wasteful. The \ (backslash) operator does this for you.
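Concretely, here is a small sketch with a made-up 2×2 system; the point is that \ solves the system via a factorization without ever forming the inverse, and the factorization can be reused for multiple right-hand sides:

```julia
using LinearAlgebra

A = [4.0 1.0; 1.0 3.0]        # small symmetric positive-definite example
b = [1.0, 2.0]

x = A \ b                     # solves A*x = b via a factorization, no inv(A)

F = cholesky(Hermitian(A))    # factor once...
x2 = F \ b                    # ...then solve cheaply, reusable for other b's
```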

PPS. If you are computing inv(X'X) in order to solve a least-squares problem, beware that you have squared the condition number of X by doing so. It is safer and more accurate to solve least-squares problems with X \ b (which uses a QR factorization) rather than by computing inv(X'X) * (X'b) or (X'X) \ (X'b). See also Efficient way of doing linear regression - #33 by stevengj
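To illustrate the condition-number point with a toy regression (made-up data, one intercept and one regressor):

```julia
using LinearAlgebra

X = [ones(5) 1.0:5.0]             # design matrix: intercept + regressor
y = [2.1, 3.9, 6.2, 8.0, 9.8]     # hypothetical responses

beta_qr     = X \ y               # QR-based solve: accuracy governed by cond(X)
beta_normal = (X' * X) \ (X' * y) # normal equations: governed by cond(X)^2

cond(X' * X)                      # ≈ cond(X)^2, i.e. squared conditioning
```

For a well-conditioned X like this one both routes agree, but as cond(X) grows the normal-equations route loses roughly twice as many digits.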