QR decomposition with Julia. How to?

This is even worse than just inverting R (which you should also generally avoid): by putting the parentheses like this, you are doing a matrix–matrix multiplication followed by a matrix–vector multiplication, rather than two matrix–vector multiplications. (AB)x is much slower than A(Bx) when A and B are matrices and x is a vector, because the matrix–matrix product costs O(n³) operations for n×n matrices, whereas each matrix–vector product costs only O(n²).
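For concreteness, here is a small timing sketch (not from the original post); it assumes the BenchmarkTools package is installed and uses arbitrary 1000×1000 matrices:

```julia
using BenchmarkTools   # assumes BenchmarkTools is installed

A = randn(1000, 1000)
B = randn(1000, 1000)
x = randn(1000)

# (A*B)*x: forms the full 1000×1000 matrix–matrix product first, O(n³) work
@btime ($A * $B) * $x;

# A*(B*x): two matrix–vector products, only O(n²) work
@btime $A * ($B * $x);
```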

If your matrices are small and you don’t care about performance, then maybe you don’t care, but you really want to get into the habit of thinking more carefully about this kind of thing in case your matrices ever get big. In any case, there are better ways to solve least-squares problems, as I explained above.

PS. That being said, `inv(R) * Q'` does automagically pick the right size of Q for you: it ends up corresponding to $\hat{R}^{-1} \hat{Q}^*$ from the thin QR. But it’s even easier to do `QR \ Y` or `X \ Y`, which do the right thing.
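As a minimal sketch of that (with made-up dimensions, and `F` standing in for the factorization object that `QR \ Y` refers to), assuming X has full column rank:

```julia
using LinearAlgebra

X = randn(1000, 10)   # tall matrix: overdetermined least-squares problem
Y = randn(1000)

F = qr(X)             # the QR factorization object
c1 = F \ Y            # least-squares solution directly from the factorization
c2 = X \ Y            # backslash on X also does a QR-based least-squares solve

c1 ≈ c2               # both give the same solution (up to roundoff)
```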
