Hi! Is there any specific module in Julia for performing linear algebra operations like Gram-Schmidt orthogonalization for a given set of vectors?
qr in the LinearAlgebra standard library does this.
(Technically, Gram–Schmidt is just one possible algorithm for QR factorization; in practice linear-algebra libraries tend to use different algorithms instead.)
Hi! Thanks for the reply, but could you elaborate a bit? Specifically, how exactly do I obtain the orthonormalized vectors from a given set of vectors?
P.S. I knew nothing about QR decomposition until this point.
Run using LinearAlgebra to load the LinearAlgebra module, then do Q = Matrix(qr(A).Q), where A is a matrix whose columns are the vectors you want to orthonormalize. This yields a matrix Q (the "thin" QR factor) whose columns are the orthonormalized columns of A, equivalent (modulo roundoff errors) to running Gram–Schmidt on the columns of A.
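For example, a small sketch (the matrix A here is made up for illustration):

```julia
using LinearAlgebra

# Columns of A are the vectors to orthonormalize (values made up for illustration):
A = [1.0 1.0;
     1.0 0.0;
     0.0 1.0]

Q = Matrix(qr(A).Q)   # 3×2 "thin" Q: orthonormalized columns of A

Q' * Q ≈ I            # columns of Q are orthonormal
Q * (Q' * A) ≈ A      # columns of Q span the same space as the columns of A
```

Both checks at the end evaluate to `true`: the columns of Q are orthonormal and span the same subspace as the columns of A, which is exactly what Gram–Schmidt produces.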
Thanks for clarifying! I'm new to Julia and didn't know this stuff before now.
No problem. This is not specific to Julia, but a lot of introductory linear-algebra courses don’t emphasize the connection of Gram–Schmidt to QR factorization.
Introductory classes tend to emphasize simple (perhaps simplistic) algorithms for hand calculation, whereas practical numerical linear algebra emphasizes matrix factorizations: LU factorization (instead of Gaussian elimination), QR factorization (instead of Gram–Schmidt), and diagonalization (instead of finding roots of characteristic polynomials).
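All three factorizations mentioned above are available as functions in the LinearAlgebra standard library, each returning a factorization object. A quick sketch (the matrix A is made up for illustration):

```julia
using LinearAlgebra

A = [4.0 1.0; 1.0 3.0]   # values made up for illustration

F = lu(A)      # LU factorization (the "factored form" of Gaussian elimination)
G = qr(A)      # QR factorization (the "factored form" of Gram–Schmidt)
E = eigen(A)   # diagonalization

# Each factorization reproduces A (up to roundoff):
F.L * F.U ≈ A[F.p, :]                                  # PA = LU (row-pivoted)
G.Q * G.R ≈ A                                          # A = QR
E.vectors * Diagonal(E.values) * inv(E.vectors) ≈ A    # A = VΛV⁻¹
```

Each object also plugs directly into further computations, e.g. `F \ b` solves Ax = b reusing the LU factors.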
Moreover, one often doesn't work with the factors directly as matrices, but instead treats them as linear operators. For example, when working with QR factors (i.e. orthogonalized bases), typically one doesn't need to look at the elements of Q explicitly; one only needs to multiply by Q or Qᵀ. Julia's qr(A).Q is actually an object that doesn't literally store the elements of Q, but can be multiplied quickly by vectors and matrices. That's why Matrix(qr(A).Q) is required if you want an explicit matrix of orthonormal vectors; in large-scale problems, you typically try to avoid this and instead work with Q implicitly as an operator.
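To illustrate the implicit-operator idea, here is a sketch (sizes and data are made up for illustration) that applies Qᵀ to a vector without ever materializing Q, as in a least-squares solve:

```julia
using LinearAlgebra

A = rand(100, 5)   # a tall matrix; sizes made up for illustration
F = qr(A)

# F.Q stores Householder reflectors, not an explicit dense matrix,
# yet it can still be applied to vectors quickly:
b = rand(100)
c = F.Q' * b              # applies Qᵀ without forming Q

# A least-squares solve needs only the first entries of Qᵀb, plus R:
x = F.R \ c[1:5]
x ≈ A \ b                 # matches Julia's built-in least-squares solve
```

For a 100×5 A this hardly matters, but when A has millions of rows, applying the stored reflectors is far cheaper in time and memory than building the corresponding explicit Q.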