Random Orthogonal Matrices

dear julia experts—has anyone already written a random orthogonal matrix generator for square matrices?

Is the right way to do this to generate a random normal matrix first and then run a Gram-Schmidt orthogonalization? (Is this somewhere in Julia already, so that I won’t reinvent the wheel?)

pointers appreciated. regards, /iaw


You can generate a random orthogonal matrix Q with

using LinearAlgebra  # needed for qr on Julia ≥ 0.7

A = rand(n,n)
Q, R = qr(A)

many thanks. this was too easy!

Abel’s solution gives you a more-or-less random matrix which happens to be orthogonal. (Nothing wrong with that, if that’s all you need.)

If you need a random draw from the uniform (Haar) distribution over the group of n × n orthogonal matrices, I think you want something like

using RandomMatrices
Q = rand(Haar(1),n) 
# or randfast(Haar(1),n) if you are impatient and not fastidious

I remember being annoyed at how hard this was to find.


even better indeed. thank you.

RandomMatrices installs under Julia 0.6.2, but the Homebrew builds for macOS abort somewhere, and there are deprecation warnings in the RandomMatrices code (e.g., use exp.() instead of exp()). So both the install and the code could use some tender care.

I am looking at the documentation. I am way over my head on the underlying math.

What I really want is K random directions with maximal mutual separation in N dimensions (N <= K). So, if I have 2 dimensions and I want 4 directions, it would be a rotation of (1,0), (0,1), (1,1), (-1,1). If by any chance this is already in this package, please let me know. Otherwise, please ignore.

I am definitely ok with what I know and what I have. I can draw two matrices, and hope that they will not happen to be highly correlated. It’s a little less efficient, but it will do.

regards,

/iaw

An easy way to draw from the Haar measure is to take the QR of a matrix of iid random normals, with R constrained to have a positive diagonal:

using LinearAlgebra  # for qr, Diagonal, diag on Julia ≥ 0.7

Q, R = qr(randn(m,n))
O = Q*Diagonal(sign.(diag(R)))

There are even faster ways to do this: see Orthogonal matrix - Wikipedia
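If it helps to sanity-check the recipe outside Julia, here is the same construction sketched in Python/NumPy (the function name `haar_orthogonal` is mine, not from any package):

```python
import numpy as np

def haar_orthogonal(n, seed=None):
    """Haar-distributed n x n orthogonal matrix: QR of an iid Gaussian
    matrix, with column signs flipped so R has a positive diagonal."""
    rng = np.random.default_rng(seed)
    Q, R = np.linalg.qr(rng.standard_normal((n, n)))
    return Q * np.sign(np.diag(R))  # scales column j by sign(R[j, j])

O = haar_orthogonal(5, seed=0)
print(np.allclose(O.T @ O, np.eye(5)))  # orthonormal columns: True
```

The broadcast `Q * np.sign(np.diag(R))` plays the role of `Q*Diagonal(sign.(diag(R)))` above.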


this is nice!

Note that this answer, which is currently marked as the accepted solution, does not produce uniformly (Haar) distributed matrices. For instance, it will always produce Q matrices whose first column has entries all of the same sign. Simonbyrne’s answer is the correct one.
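A quick way to see this, sketched in NumPy terms (uniform entries in [0, 1) standing in for Julia’s `rand`):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((4, 4))   # entries uniform in [0, 1), so all positive
Q, R = np.linalg.qr(A)

# Q R = A implies Q[:, 0] = A[:, 0] / R[0, 0], so the entries of the
# first column of Q all share one sign -- a dead giveaway that Q is
# not Haar distributed.
col = Q[:, 0]
print(np.all(col > 0) or np.all(col < 0))  # True
```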

Another remark is that (Julia’s Householder-based) QR is more numerically stable than the Gram-Schmidt approach the OP was considering.


Indeed. For a quick sim on how to verify what @fph shared, see figure 4.1 of the reference given in the reply below.


That link is dead. Is there an updated version available?

A. Edelman and N. R. Rao. Random matrix theory. Acta Numerica, 14:233–297, 2005.


In the accepted solution, Q is an m\times m matrix. Is the distribution of Q independent of n? If not, what is the dependency of Q on n?

A = rand(m,n)
Q, R = qr(A)

with m \geq n will produce a random m \times n Q with orthonormal columns taken from the same measure as in the square case. The QR algorithm works sequentially on the columns of A to build corresponding columns of Q and R. So the first n columns of Q and R will be the same as they would if A were extended to be square with A_{ij}, j \leq n held constant.
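That column-by-column behavior is easy to check numerically. A NumPy sketch (both NumPy and Julia call LAPACK’s Householder QR for dense matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 6, 3
A = rng.standard_normal((m, n))

Q_thin, _ = np.linalg.qr(A)                        # m x n, reduced QR
A_ext = np.hstack([A, rng.standard_normal((m, m - n))])
Q_full, _ = np.linalg.qr(A_ext, mode="complete")   # m x m, full QR

# Householder QR determines the j-th reflector from the leading j
# columns, so appending extra columns to A does not change the first
# n columns of Q.
print(np.allclose(Q_full[:, :n], Q_thin))  # True
```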

Thank you very much for your answer! Extrapolating, this implies that once n \ge m, the Q matrix does not change anymore as n increases? Hence, with focus on sampling m\times m matrices Q, there is no reason to go beyond n = m?

I recently dug into methods for uniform sampling from the group of unit quaternions, and I stumbled upon a good reference that gives a method ensuring that the probability distribution of the generated orthogonal matrix is the Haar measure: Francesco Mezzadri, How to generate random matrices from the classical compact groups, Notices Amer. Math. Soc. 54(5), 592–604, 2007.
