Can Julia or some existing package simulate random covariance (symmetric + positive definite) matrices?
I could do
using LinearAlgebra
function random_covariance_matrix(n::Int)
    x = rand(n, n)
    xsym = 0.5(x + x') # make x symmetric
    return xsym + n*I  # diagonally dominant matrices are positive definite
end
A = random_covariance_matrix(3)
isposdef(A)
issymmetric(A)
or something similar, but I’m not sure whether diagonal dominance (or a similar trick) makes these matrices more special, i.e. less representative, than a typical covariance matrix.
If you want the distribution to be invariant under orthogonal rotations (just like the uniform distribution over real numbers is invariant under translations), then you want the Haar distribution, which you can sample from via
using LinearAlgebra
Q, R = qr(randn(n, n))
Q = Q*Diagonal(sign.(diag(R))) # sign correction so Q is Haar-distributed
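As a quick sanity check (my own addition, not part of the reply), the resulting Q should be orthogonal up to floating-point roundoff:
opnorm(Q' * Q - I) # on the order of machine epsilon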
Edit: whoops, I completely mixed up the question; my answer is for generating orthogonal matrices, not PSD ones. They’re connected in my head because you can generate a random PSD matrix by taking such a Q and using it as the matrix of eigenvectors of your PSD matrix, e.g.
Q * Diagonal(x*rand(n)) * transpose(Q)
where x sets the scale: the eigenvalues are drawn uniformly from [0, x), so x is how large you want the largest eigenvalue to possibly be.
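Putting the two steps together, here’s a small sketch (the function name random_psd and the maxeig keyword are mine, not from the thread): draw a Haar-distributed Q, then use it as the eigenvector matrix with eigenvalues uniform on [0, maxeig).
using LinearAlgebra

# Sketch: random symmetric positive definite matrix with Haar-distributed
# eigenvectors and eigenvalues drawn uniformly from [0, maxeig)
function random_psd(n::Int; maxeig::Real = 1.0)
    Q, R = qr(randn(n, n))
    Q = Q * Diagonal(sign.(diag(R)))  # sign fix so Q is Haar-distributed
    return Symmetric(Q * Diagonal(maxeig .* rand(n)) * transpose(Q))
end

A = random_psd(3; maxeig = 10.0)
issymmetric(A), isposdef(A)  # (true, true) with probability 1
Wrapping the result in Symmetric guards against the tiny floating-point asymmetry that Q * D * Q' can pick up, so issymmetric and isposdef behave as expected.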
If you aren’t concerned about the values, only that the result is symmetric and positive definite, you can use A = let X = randn(n, n); X * X' + I; end (which is what we use for tests in ChainRules).
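For completeness, a spelled-out version of that one-liner (my expansion of the same idea): X * X' is symmetric and positive semidefinite by construction, and adding I shifts every eigenvalue up by 1, so the result is comfortably positive definite.
using LinearAlgebra

n = 4
A = let X = randn(n, n)
    X * X' + I  # X*X' is symmetric PSD; + I shifts all eigenvalues up by 1
end
issymmetric(A)  # true
isposdef(A)     # true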
GOE, that is GaussHermite(1), gives random symmetric matrices with Gaussian entries. It’s also a lot faster: sampling just the eigenvalues costs O(n^2), which is enough to answer the positive-definiteness question.
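If I’m reading that right, the O(n^2) claim refers to the tridiagonal (Dumitriu-Edelman) model of the Gaussian ensembles: for β = 1 (GOE) you can sample just the eigenvalues without ever forming the dense matrix. A rough sketch, assuming Distributions.jl for the chi-distributed off-diagonals (the function name and the exact normalization convention are mine):
using LinearAlgebra, Distributions

# Sketch: sample only the eigenvalues of an n×n GOE matrix via the
# Dumitriu-Edelman tridiagonal model (β = 1), instead of building the dense
# matrix and paying O(n^3) for a full eigendecomposition
function goe_eigvals(n::Int)
    dv = randn(n)                                      # diagonal entries ~ N(0, 1)
    ev = [rand(Chi(k)) / sqrt(2) for k in (n-1):-1:1]  # off-diagonals ~ χ_k / √2
    return eigvals(SymTridiagonal(dv, ev))             # tridiagonal eigvals cost O(n^2)
end

λ = goe_eigvals(1000)
all(λ .> 0)  # almost surely false: a GOE matrix is not positive definite for large n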