Numerical issues causing wrong determinant values

Problem

I am practicing writing efficient Julia code that could (hopefully) be comparable in speed with the functions provided in libraries. Specifically, I am trying to write a logpdf function, which returns the log probability of a given input.

However, when I evaluate the determinant of the random vector's covariance, i.e. \Sigma = \sigma_0^2 \mathbf{I} + \sigma_1^2 \mathbf{ZZ}^T where \mathbf{Z} \in \mathbb{R}^{n\times q}, the determinant is 0.

I think this is because the matrix is ill-conditioned, but I am not sure how to resolve this issue.

using Random, LinearAlgebra

Random.seed!(280)
n, q = 1000, 10
Z = randn(n, q)                # Z ∈ ℝ^{n×q}
σ0, σ1 = 0.5, 2.0
Σ = σ1^2 * Z * Z' + σ0^2 * I   # Σ = σ1² ZZᵀ + σ0² I
det(Σ)                         # returns 0.0

Could anyone help me? Thank you in advance.

Not sure if this is relevant to your actual problem (i.e. real input data), but just from looking at your generated data: \mathbf{ZZ}^T has rank at most q = 10, so n - q = 990 of the eigenvalues of \Sigma are exactly σ0^2 = 0.25. The determinant therefore contains a factor of roughly 0.25^990 ≈ 10^{-596}, which is far below the smallest positive 64-bit floating-point number (about 10^{-308}), so det underflows to 0.
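You can verify this with a quick sketch, reusing Σ and σ0 from your snippet:

using LinearAlgebra

λ = eigvals(Symmetric(Σ))            # Σ, σ0 as in the question
count(x -> isapprox(x, σ0^2), λ)     # 990 eigenvalues equal σ0² = 0.25
sum(log.(λ))                         # log-determinant: finite, large and negative
exp(sum(log.(λ)))                    # exponentiating underflows to 0.0 in Float64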

Edit: Oh, and I forgot to mention that I got non-zero results when I set σ0 to 1 (without seeding the RNG). That is consistent with the above: the n - q small eigenvalues become 1, so they no longer push the product toward zero.


Try using logdet instead.
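With the Σ from the question, for instance:

using LinearAlgebra

logdet(Σ)   # finite log-determinant (a large negative number here)

Mathematically det(Σ) = exp(logdet(Σ)), but that exp underflows in Float64, which is why det returns 0.0 while logdet stays well-behaved.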


Also, for this specific case you can compute it much more efficiently using the matrix determinant lemma: \det(\sigma_0^2 \mathbf{I}_n + \sigma_1^2 \mathbf{ZZ}^T) = \sigma_0^{2n} \det(\mathbf{I}_q + (\sigma_1/\sigma_0)^2 \mathbf{Z}^T\mathbf{Z}), which reduces the problem to a q × q determinant. You can either do this directly, or make use of the WoodburyMatrices.jl package:
https://github.com/timholy/WoodburyMatrices.jl
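A minimal sketch of the direct approach (logdet_lowrank is a hypothetical helper name; Z, σ0, σ1 as in the question):

using LinearAlgebra

# logdet(σ0² I + σ1² ZZᵀ) = 2n·log(σ0) + logdet(I_q + (σ1/σ0)² ZᵀZ)
function logdet_lowrank(Z, σ0, σ1)
    n, q = size(Z)
    S = I + (σ1 / σ0)^2 * (Z' * Z)   # small q×q matrix, cheap to factorize
    return 2n * log(σ0) + logdet(Symmetric(S))
end

logdet_lowrank(Z, σ0, σ1) ≈ logdet(Σ)   # true, up to roundoff

This only factorizes a q × q matrix, so it scales as O(nq²) rather than the O(n³) of working with Σ directly.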