All positive eigenvalues but isposdef returns false?

Hello, I was curious why my Julia code spits out negative variances for some variants of sandwich-form vcov matrices (in fixed effects, clustering, etc.).
Let's say my matrix of control variables is X.
I found that (X'X)^(-1) has all positive eigenvalues, but isposdef(inv(X' * X)) still returns false.
How is this possible? As far as I know, for a symmetric matrix, having all positive eigenvalues is equivalent to being positive definite.
Can someone help me figure out what's going on here?

PS. I found that the inverse of a symmetric matrix is not always symmetric in Julia. Is there a typical way to fix this?

FYI,
X'*X = [1800 900 18900 8955 900 14400;
900 900 13950 4410 450 7200;
18900 13950 258300 92595 9450 151200;
8955 4410 92595 63645 4440 71640;
900 450 9450 4440 900 7200;
14400 7200 151200 71640 7200 148800]

The problem is that isposdef also requires the matrix to be Hermitian. Your matrix X'*X is indeed Hermitian (real-symmetric), but because of roundoff errors its inverse is not:

julia> A = [1800 900 18900 8955 900 14400;
       900 900 13950 4410 450 7200;
       18900 13950 258300 92595 9450 151200;
       8955 4410 92595 63645 4440 71640;
       900 450 9450 4440 900 7200;
       14400 7200 151200 71640 7200 148800];

julia> isposdef(A)
true

julia> isposdef(A^-1)
false

julia> ishermitian(A^-1)
false

julia> A^-1 - (A^-1)'
6×6 Matrix{Float64}:
  0.0           7.37257e-18  -8.13152e-19   1.0842e-19    1.51788e-18   3.79471e-19
 -7.37257e-18   0.0           0.0           3.35425e-19   1.73938e-18   5.42101e-19
  8.13152e-19   0.0           0.0          -3.854e-20    -1.74515e-19  -5.42101e-20
 -1.0842e-19   -3.35425e-19   3.854e-20     0.0          -1.06726e-19  -1.33408e-20
 -1.51788e-18  -1.73938e-18   1.74515e-19   1.06726e-19   0.0           0.0
 -3.79471e-19  -5.42101e-19   5.42101e-20   1.33408e-20   0.0           0.0

You can fix this by telling Julia that the matrix is Hermitian, so that this property is enforced through matrix inversion and other operations, simply by wrapping it in the Hermitian type:

julia> H = Hermitian(A);

julia> ishermitian(H^-1)
true

julia> isposdef(H^-1)
true

PS. The generic advice in numerical computation is to avoid computing matrix inverses explicitly, and use matrix factorizations instead — normally the explicit inverse is not needed, and computing it is wasteful. The \ (backslash) operator does this for you.
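As a quick illustration (with a made-up 2×2 system, not your data), the explicit inverse and backslash give the same solution, but backslash gets there by factorizing the matrix rather than inverting it:

```julia
using LinearAlgebra

A = [4.0 1.0; 1.0 3.0]   # small symmetric positive-definite example
b = [1.0, 2.0]

# Explicit inverse: computes all of A^-1 just to apply it once.
x_inv = inv(A) * b

# Backslash: factorizes A and solves the system directly,
# which is both cheaper and numerically more accurate.
x_bs = A \ b

x_inv ≈ x_bs   # same answer, better route
```

For repeated solves with the same matrix, you can also keep the factorization itself around (e.g. `F = cholesky(Hermitian(A)); F \ b`).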

PPS. If you are computing inv(X'X) in order to solve a least-squares problem, beware that you have squared the condition number of X by doing so. It is safer and more accurate to solve least-squares problems by X \ b (which uses QR factorization) rather than by computing inv(X'X) * X'b or (X'X) \ (X'b). See also Efficient way of doing linear regression - #33 by stevengj
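To sketch the condition-number point with some hypothetical regression data (not your matrix), both routes agree on a well-conditioned problem, but the normal-equations matrix X'X has the squared condition number:

```julia
using LinearAlgebra

# Hypothetical data: intercept column plus one regressor
X = [1.0 1.0; 1.0 2.0; 1.0 3.0; 1.0 4.0]
y = [2.1, 3.9, 6.2, 7.8]

# Normal equations: forms X'X, squaring the condition number of X
β_ne = (X'X) \ (X'y)

# QR-based least squares: works with X directly
β_qr = X \ y

β_ne ≈ β_qr          # agree here, but the QR route is the robust one
cond(X'X) ≈ cond(X)^2  # why forming X'X loses accuracy for ill-conditioned X
```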

PPPS. Quote your code: PSA: how to quote code with backticks


Wow, you solved my problem and also gave really relevant advice!
I'll do that. Thanks a lot!