# isposdef() true with small negative eigenvalues?

Hi all, Iβm puzzled by eigen() and isposdef() acting on r'r for this matrix r

using LinearAlgebra

r = [
  0.0   0.0   0.0   0.0   0.0   0.0   0.0   0.0   0.0
 -1.0   1.0   0.0   0.0   0.0   0.0   0.0   0.0   0.0
  1.0  -2.0   1.0   0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0  -1.0   1.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   1.0  -2.0   1.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0   0.0   0.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0  -1.0   1.0   0.0
  0.0   0.0   0.0   0.0   0.0   0.0   1.0  -2.0   1.0
]


Now if we compute `r'r`, it has very small negative eigenvalues:

julia> λ = eigen(r'r).values
9-element Vector{Float64}:
-1.969708422286297e-16
-1.969708422286297e-16
-1.969708422286297e-16
0.39444872453601043
0.39444872453601043
0.39444872453601043
7.605551275463979
7.605551275463979
7.605551275463979


but why then is `isposdef(r'r) == true`?

julia> isposdef(r'r)
true


Shoudnβt those zero or negative eigenvalues make positive definiteness for r'r untrue? On further investigating, there are some small singular values in r,

julia> svd(r).S'
2.75782  2.75782  2.75782  0.628052  0.628052  0.628052  3.73941e-17  3.73941e-17  3.73941e-17


and as expected for r'r

julia> svd(r'r).S'
7.60555  7.60555  7.60555  0.394449  0.394449  0.394449  2.22776e-16  2.22776e-16  2.22776e-16


but what makes isposdef true? Any insight will be much appreciated.

isposdef and similar functions like rank ignore small values. So even though you see a couple of negative eigenvalues, their magnitude is below machine epsilon and thus they will be treated as 0. I don't know where exactly the cutoff is, though.

> is below the machine epsilon and thus they will be treated as 0

for positive definiteness, don't the eigenvalues have to be strictly greater than zero?

So basically your matrix is singular, which means that the tiniest deviation caused by floating-point inaccuracy can change its character. What `isposdef` does is try to compute a Cholesky factorization, and in fact this succeeds:

julia> cholesky(r'r)
Cholesky{Float64, Matrix{Float64}}
U factor:
9×9 UpperTriangular{Float64, Matrix{Float64}}:
 1.41421  -2.12132    0.707107    0.0      …  0.0       0.0        0.0
  ⋅        0.707107  -0.707107    0.0         0.0       0.0        0.0
  ⋅         ⋅         2.10734e-8  0.0         0.0       0.0        0.0
  ⋅         ⋅          ⋅          1.41421     0.0       0.0        0.0
  ⋅         ⋅          ⋅           ⋅          0.0       0.0        0.0
  ⋅         ⋅          ⋅           ⋅       …  0.0       0.0        0.0
  ⋅         ⋅          ⋅           ⋅          1.41421  -2.12132    0.707107
  ⋅         ⋅          ⋅           ⋅           ⋅        0.707107  -0.707107
  ⋅         ⋅          ⋅           ⋅           ⋅         ⋅         2.10734e-8


So from a numerical point of view your matrix might as well be positive definite. Probably the safest way to check for semi-definiteness is to also examine the eigenvalues/singular values and see whether they are large enough relative to some tolerance.
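For example (a sketch only; `is_possemidef` and its default tolerance are my own invention, not a library function):

```julia
using LinearAlgebra

# Hypothetical tolerance-based semi-definiteness check: compute the
# eigenvalues of the symmetrized matrix and accept small negative
# values whose magnitude is below a scaled machine epsilon.
function is_possemidef(A; tol = size(A, 1) * eps(Float64) * opnorm(A))
    all(λ -> λ >= -tol, eigvals(Hermitian(A)))
end

is_possemidef([1.0 1.0; 1.0 1.0])    # eigenvalues 0 and 2: accepted
is_possemidef([-1.0 0.0; 0.0 1.0])   # eigenvalue -1: rejected
```

With this check, `r'r` from above would be classified as positive semi-definite, since its tiny negative eigenvalues are well within the tolerance.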

julia> diag(ans.L)
9-element Vector{Float64}:
1.4142135623730951
0.7071067811865481
2.1073424255447017e-8
1.4142135623730951
0.7071067811865481
2.1073424255447017e-8
1.4142135623730951
0.7071067811865481
2.1073424255447017e-8


You see 3 values of order \sqrt{eps}, which means that the original matrix might as well be singular (and in this case it is). Generally, determining exactly whether a matrix is singular is numerically difficult and essentially impossible in floating point. The best you can do is compute some eigen-/singular values and use a cutoff to decide which should be treated as zero. This is what rank does, e.g.

julia> rank(r'r)
6
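The idea behind `rank` can be sketched like this (simplified; `my_rank` is my own name, and the real `rank` also accepts `atol`/`rtol` keyword arguments):

```julia
using LinearAlgebra

# Hypothetical sketch of rank's cutoff strategy: count only the
# singular values that exceed a relative tolerance scaled by the
# largest singular value.
function my_rank(A; rtol = minimum(size(A)) * eps(Float64))
    s = svdvals(A)                      # sorted largest-first
    count(x -> x > rtol * first(s), s)
end

my_rank([1.0 0.0; 0.0 0.0])  # one singular value above the cutoff
```

Applied to `r'r`, the three singular values of order 1e-16 fall below the cutoff, which is why `rank(r'r)` reports 6 rather than 9.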


So to summarize: the title is not quite right. Your matrix is singular, which is hard to detect numerically. Perhaps the docstring of `isposdef` should warn about this more.
