Gram-Schmidt decomposition

Note that this is not faster in the complexity sense — all of these algorithms are \Theta(mn^2) for an m \times n matrix A. But it is probably faster in actual time (i.e. it has a smaller “constant coefficient” in front of the mn^2) for m \gg n, similar to solving least-squares problems by the normal equations as discussed in Efficient way of doing linear regression - #33 by stevengj … at the expense of accuracy, as you point out.
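
For concreteness, here is a minimal sketch of the kind of \Theta(mn^2) algorithm I’m assuming is under discussion (a Cholesky-based, normal-equations-style QR that forms A'A and then divides by its triangular factor — the function name `cholqr` is just for illustration):

```julia
using LinearAlgebra

# Hypothetical sketch of a Cholesky-based QR:
# Θ(mn²) like Householder QR or Gram–Schmidt, but built from one A'A
# product and one triangular solve, which run fast (BLAS-3) for m ≫ n,
# at the cost of squaring the condition number of A.
function cholqr(A::AbstractMatrix)
    U = cholesky(Hermitian(A' * A)).U   # A'A = U'U with U upper triangular
    Q = A / U                           # triangular solve (see below), not A * inv(U)
    return Q, U
end

A = randn(10^5, 50)
Q, R = cholqr(A)
norm(Q * R - A) / norm(A)   # small residual; orthogonality of Q is another matter
```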

Technically, there is no such thing as “not as stable”. It’s a boolean choice: an algorithm is either numerically stable or it isn’t. The question is whether the (backward) error goes to zero (at worst) linearly with the precision \varepsilon, i.e. whether the backward error is O(\varepsilon).
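
As a rough numerical illustration of that O(\varepsilon) statement (Householder QR, as computed by `qr`, is backward stable, so its error should scale roughly linearly when the precision changes):

```julia
using LinearAlgebra

A64 = randn(300, 100)
A32 = Float32.(A64)
F64, F32 = qr(A64), qr(A32)

# Relative residual ‖QR - A‖ / ‖A‖ at each precision (a proxy for the backward error):
err64 = norm(Matrix(F64.Q) * F64.R - A64) / norm(A64)
err32 = norm(Matrix(F32.Q) * F32.R - A32) / norm(A32)

# Both ratios are modest constants, i.e. the error is O(ε):
(err64 / eps(Float64), err32 / eps(Float32))
```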

This algorithm (which squares the condition number of A) is definitely much less accurate than Householder QR or modified Gram–Schmidt. I don’t know offhand whether it is numerically unstable (i.e., whether its backward errors decrease more slowly than linearly with \varepsilon), but my guess is that it probably is.
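
A quick (hypothetical) experiment along these lines, comparing the loss of orthogonality of the Cholesky/normal-equations approach against Householder QR on a moderately ill-conditioned matrix:

```julia
using LinearAlgebra

# Build A with singular values from 1 down to 1e-6, so cond(A) ≈ 1e6:
m, n = 1000, 50
Qa, _ = qr(randn(m, n)); Qb, _ = qr(randn(n, n))
A = Matrix(Qa) * Diagonal(exp10.(range(0, -6, length=n))) * Matrix(Qb)'

Qh = Matrix(qr(A).Q)                  # Householder QR
U  = cholesky(Hermitian(A' * A)).U    # cond(A'A) ≈ cond(A)^2 ≈ 1e12
Qc = A / U

norm(Qh' * Qh - I)   # ≈ a small multiple of eps()
norm(Qc' * Qc - I)   # ≈ cond(A)^2 * eps(), many orders of magnitude worse
```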

Aside: One should get out of the habit of using matrix inverses for numerical calculations. One should instinctively do A / U rather than A * inv(U). In this case, A / U exploits the fact that U is upper triangular: it solves the triangular system by back substitution, which avoids the extra O(n^3) work of explicitly forming inv(U) and is also more accurate.
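
A concrete comparison (reusing the same kind of upper-triangular Cholesky factor as in the sketches above):

```julia
using LinearAlgebra

A = randn(1000, 100)
U = cholesky(Hermitian(A' * A)).U   # upper-triangular factor

Q1 = A / U        # right triangular solve: back substitution, no explicit inverse
Q2 = A * inv(U)   # forms inv(U) explicitly: extra Θ(n³) work and extra rounding

norm(Q1 - Q2) / norm(Q1)   # same result up to rounding error
```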
