RQ decomposition

Hi,

Can anyone point me to a Julia implementation of the RQ decomposition?

Thanks.

MatrixFactorizations.jl

I'm curious why you want it?

I've never needed the RQ factorization by itself, but Givens RQ is a useful starting point to perform in-place ldiv! solves or logdet calculations with Hessenberg matrices ((H+μI) \ x solvers for Hessenberg factorizations by stevengj · Pull Request #31853 · JuliaLang/julia · GitHub), following Henry (1994).


Cool. I don't understand your first point: I thought QR could be done in place?

I also don't understand your second point, but I'll have to read the reference first :sweat_smile:

I'm really interested in the ∞-dimensional case, where QR and QL are fundamentally different. E.g. QR is unique, but you can have multiple QL factorisations depending on the Fredholm index. I haven't thought much about RQ and LQ, as they are equivalent to QL and QR applied to the adjoint, but perhaps I've overlooked a nuance.

TL;DR: basically, you carry out the Givens RQ algorithm at the same time as you apply backsubstitution: factoring H = RQ turns Hx = b into R(Qx) = b, so you solve Ry = b and then recover x = Q^* y, or more generally handle a shifted system (H - \mu I) x = b, rather than factoring and solving separately. (Both Givens RQ and backsubstitution iterate over the rows in reverse order.) This allows you to leave H unmodified (which is especially useful if you are going to repeat this process for multiple different shifts \mu) rather than overwriting it, and instead you only need auxiliary storage for one row of R (the current row for backsubstitution). (Similarly, Q is not stored, but applied one Givens rotation at a time to b and to construct the preceding row of R.)
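
Here is a minimal sketch of that idea for the real case with a single right-hand side, with the backsubstitution organized column-by-column (which comes to the same thing). Everything here is illustrative: hessenberg_shifted_solve is a hypothetical name (the actual PR is more general), and it leans on the unexported stdlib helpers LinearAlgebra.checksquare and LinearAlgebra.givensAlgorithm.

using LinearAlgebra: checksquare, givensAlgorithm

# Solve (H + μ*I) x = b for upper-Hessenberg H in O(m^2) operations,
# fusing the Givens RQ factorization of H + μI with the triangular solve.
# H is read but never modified; the only extra storage is one working
# column u and the Givens rotations that make up Q.
function hessenberg_shifted_solve(H::AbstractMatrix{<:Real},
                                  b::AbstractVector{<:Real}, μ::Real=0)
    m = checksquare(H)
    x = float(copy(b))             # overwritten: first by y = R⁻¹b, then by x = Qᵀy
    u = float(H[:, m]); u[m] += μ  # working column of H + μI
    cs = Vector{NTuple{2,eltype(x)}}(undef, m)   # the Givens rotations (c, s)
    for k = m:-1:2
        # rotation on columns (k-1, k) that annihilates the subdiagonal entry:
        c, s, ρ = givensAlgorithm(u[k], H[k, k-1])   # ρ = R[k,k]
        cs[k] = (c, s)
        x[k] /= ρ                  # y[k], by column-oriented backsubstitution
        for j = 1:k-1
            Ajk = H[j, k-1] + (j == k-1) * μ         # (H + μI)[j, k-1]
            x[j] -= x[k] * (c * u[j] + s * Ajk)      # subtract y[k] * R[j,k]
            u[j] = c * Ajk - s * u[j]                # rotated column k-1 → next u
        end
    end
    x[1] /= u[1]                   # y[1]: after the loop, u[1] = R[1,1]
    for k = 2:m                    # x = Qᵀ y, applied one rotation at a time
        c, s = cs[k]
        x[k-1], x[k] = c * x[k-1] + s * x[k], c * x[k] - s * x[k-1]
    end
    return x
end

A quick sanity check against the dense solve:

julia> using LinearAlgebra

julia> H = triu(rand(5,5), -1); b = rand(5);

julia> hessenberg_shifted_solve(H, b, 2.0) ≈ (H + 2.0*I) \ b
true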


And the reason you need Givens instead of Householder is to avoid creating a Householder vector? Interesting!

You use Givens instead of Householder in order to exploit the sparsity pattern of an upper Hessenberg matrix H, which gives you O(m^2) complexity instead of O(m^3).

It's for the same reason that Givens QR is used for QR iteration on Hessenberg matrices: exploiting the Hessenberg sparsity makes each QR iteration O(m^2).

(Similarly, we could implement fast qr methods for Tridiagonal, UpperHessenberg, etc. using Givens rotations, but currently it seems to just fall back to the O(m^3) dense Householder algorithm.)
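
For instance, a bare-bones sketch of what such a method could look like (hessenberg_qr! is a hypothetical name, not a stdlib function; it overwrites its argument with R and returns the rotations, whose product gives Q'):

using LinearAlgebra

# QR of an m×m upper-Hessenberg matrix via m-1 Givens rotations.
# Each rotation touches only two adjacent rows, so the total cost is
# O(m^2) instead of the O(m^3) of dense Householder QR.
function hessenberg_qr!(R::AbstractMatrix{<:AbstractFloat})
    m = LinearAlgebra.checksquare(R)
    rotations = LinearAlgebra.Givens{eltype(R)}[]
    for k = 1:m-1
        G, r = givens(R[k, k], R[k+1, k], k, k+1)  # zero the single subdiagonal entry
        push!(rotations, G)
        lmul!(G, R)        # updates rows k and k+1 only: O(m) work
        R[k+1, k] = 0      # clean up roundoff in the annihilated entry
    end
    return rotations, R    # Q' = rotations[end] ⋯ rotations[1]
end

julia> A = triu(rand(6,6), -1);  # a random upper-Hessenberg matrix

julia> Gs, R = hessenberg_qr!(copy(A)); istriu(R)
true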

I don't understand this. Givens is just a 2x2 Householder with a sign change… You can easily do banded Householder, and the factors matrix is banded… I never understood the logic for using Givens for banded matrices (and I wrote the only extant publicly available banded QR code that I know of!)

It sounds like you understand it perfectly: you need to use some form of 2x2 banded Householder to exploit the sparsity of an upper-Hessenberg matrix, and the conventional name for that algorithm is "Givens".

(Whereas the unqualified term "Householder QR" usually refers to the non-banded version.)
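
A quick 2x2 illustration of that naming (the reflector has determinant -1, the rotation +1, hence the sign change):

julia> using LinearAlgebra

julia> x = [3.0, 4.0];

julia> G, r = givens(x[1], x[2], 1, 2);  # Givens rotation with G*x = [5, 0]

julia> v = x - norm(x) * [1, 0]; F = I - 2 * v * v' / (v' * v);  # 2x2 Householder reflector

julia> F * x ≈ G * x ≈ [5, 0]  # both annihilate the second entry
true

julia> G * Matrix(1.0I, 2, 2) ≈ [1 0; 0 -1] * F  # identical up to a row sign flip
true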


Ah OK! Alex Townsend insisted Givens is better for banded matrices, and I never understood why :sweat_smile: I guess you then agree with me that if the matrix has more bands than tridiagonal/Hessenberg, then "banded Householder" is more sensible than Givens.

Givens (skipping zero blocks) and banded Householder (matched to bandwidth & skipping zero blocks) seem like they should have the same arithmetic complexity for banded matrices, no? (Givens is certainly better than "default" non-banded Householder; maybe that's what Alex meant?)
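
As a rough count under those assumptions: for an m \times m matrix with lower bandwidth p and upper bandwidth q, Givens QR annihilates about mp subdiagonal entries, each with a rotation that updates O(p + q) entries in two rows, for O(m p (p + q)) operations in total; banded Householder applies m reflectors of length p + 1 to O(p + q) columns each, which is also O(m p (p + q)).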

But banded Householder might be better than Givens for large bandwidths for the same reasons that BLAS3 is better than BLAS2 (bigger blocks = better cache locality, wider SIMD, better parallelization, etcetera).

(I'm afraid we've hijacked the discussion from the OP!)


Thanks.

I was trying to perform a particular orthogonalisation of a tensor decomposition.

Note that RQ is equivalent to a QR decomposition of the adjoint after reversing the order of the rows (i.e. qr(reverse(A,dims=1)')), or more simply to an LQ decomposition after reversing the rows (i.e. to lq(reverse(A,dims=1))).

julia> using LinearAlgebra

julia> A = rand(3,5);

julia> F = lq(reverse(A, dims=1));

julia> R, Q = reverse(F.L), reverse(Matrix(F.Q), dims=1);  # reverse with no dims flips both dimensions of L

julia> A ≈ R * Q  # check A = RQ
true

julia> istriu(R)  # check upper triangular
true

julia> Q*Q' ≈ I   # check orthonormal rows
true

This is a circular answer to the question: you are saying you want the RQ factorization because you want that particular orthogonalization. But for what reason (in what context) do you need this particular orthogonalization, as opposed to e.g. LQ?
