QR factorization with custom dot product

Is there a way of passing a custom dot product to the QR factorization routine? In my application, the column vectors are the coefficients of a basis of continuous functions and the dot product is the integral over a volume.

EDIT: SVD factorization would also work. The ultimate goal is to solve a least squares problem.
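For a concrete toy instance of this setup (my example, not necessarily the poster's): with the monomial basis f_i(t) = t^(i-1) on [0,1], the Gram matrix of the inner product ⟨f,g⟩ = ∫₀¹ f g dt is the Hilbert matrix, and the custom dot product acts on coefficient vectors through it:

k = 4
B = [1 / (i + j - 1) for i in 1:k, j in 1:k]  # B[i,j] = ∫₀¹ t^(i-1) t^(j-1) dt
dot_B(x, y) = x' * B * y                       # custom dot product on coefficient vectors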

Sounds like a generalised eigenvalue problem to me (which can be done in any eigenvalue solver, I think); but if you want to avoid the normal equations, then I'm not sure it is implemented.

You may want the generalized QR factorization, which can be used to solve weighted least-squares / GLM problems of the sort you describe. The good news is that solvers for this exist in LAPACK, which is included with Julia. The bad news is that it looks like no one has written nice Julia wrappers for the *GGGLM functions that you probably want to call, so you’d have to write them yourself.
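For reference, DGGGLM solves the general Gauss-Markov problem: minimize ||y||_2 subject to d = A*x + B*y. Below is an untested sketch of such a hand-rolled wrapper (the name ggglm! and the exact import paths are my own guesses; it follows the ccall pattern used inside the LinearAlgebra stdlib and assumes Julia ≥ 1.7, where LAPACK symbols are routed through libblastrampoline):

using LinearAlgebra
using LinearAlgebra: BlasInt, libblastrampoline
using LinearAlgebra.BLAS: @blasfunc

# Solves min ||y||_2 subject to d = A*x + B*y; overwrites A, B, and d.
function ggglm!(A::Matrix{Float64}, B::Matrix{Float64}, d::Vector{Float64})
    n, m = size(A)
    p = size(B, 2)
    @assert size(B, 1) == n && length(d) == n
    x = Vector{Float64}(undef, m)
    y = Vector{Float64}(undef, p)
    work = Vector{Float64}(undef, 1)
    lwork = BlasInt(-1)                 # -1 asks LAPACK for the optimal workspace size
    info = Ref{BlasInt}(0)
    for _ in 1:2                        # first pass queries the workspace, second solves
        ccall((@blasfunc(dggglm_), libblastrampoline), Cvoid,
              (Ref{BlasInt}, Ref{BlasInt}, Ref{BlasInt},
               Ptr{Float64}, Ref{BlasInt}, Ptr{Float64}, Ref{BlasInt},
               Ptr{Float64}, Ptr{Float64}, Ptr{Float64},
               Ptr{Float64}, Ref{BlasInt}, Ref{BlasInt}),
              n, m, p, A, max(1, n), B, max(1, n), d, x, y,
              work, lwork, info)
        info[] == 0 || error("dggglm_ failed with info = $(info[])")
        lwork = BlasInt(work[1])
        resize!(work, lwork)
    end
    return x, y
end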

Alternatively, you can do a change of variables via the Cholesky factorization. Suppose you want the "QR" factors of A with a custom dot product x'By, i.e. weighted by a positive-definite matrix B (the Gram matrix of your basis, in your case). That is, you want the "weighted unitary" matrix Z such that Z'BZ = I and A = ZR, where R is upper-triangular. The trick is to use the Cholesky factorization B = U'U of B, in which case Z'BZ = (UZ)'(UZ) = I, so Q = UZ is unitary and UA = (UZ)R is an "ordinary" QR factorization. Hence:

  1. Cholesky-factor B = U'U with U = cholesky(B).U
  2. Find the QR factorization of U*A with F = qr(U*A)
  3. You can find Z if needed with Z = U \ Matrix(F.Q), although in practice you may want to work with Z implicitly as described below.
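A minimal runnable sketch of these three steps on random data (the names and data are mine), checking that Z is "B-unitary" and that A = ZR is recovered:

using LinearAlgebra
n, k = 6, 3
M = randn(n, n); B = M' * M + I   # random positive-definite Gram matrix
A = randn(n, k)
U = cholesky(B).U                 # step 1: B = U'U
F = qr(U * A)                     # step 2: ordinary QR of U*A
Z = U \ Matrix(F.Q)               # step 3: the weighted-unitary factor
@assert Z' * B * Z ≈ I(k)         # Z'BZ = I in the B inner product
@assert A ≈ Z * F.R               # A = ZR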

For example, if you want to solve the weighted least-squares problem, to minimize |Ax - b| in your B norm, it corresponds, I think, to solving the normal equations A'BAx = A'Bb. Plugging in A = ZR, we get R'Z'BZRx = R'Z'Bb, and after cancellations we find the well-conditioned equations Rx = Z'Bb. Writing B = U'U and Z = U\Q, this reduces to Rx = Q'Ub, which is the least-squares solution for UA = QR with right-hand side Ub. Given the above code, the least-squares solution x is then simply x = F \ (U*b), or, even simpler, we could condense all of the steps to:

U = cholesky(B).U
x = (U*A) \ (U*b)

since \ solves the least-squares problem for a non-square A.
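Here is a quick numerical check (with made-up data) that the condensed two-liner really solves the weighted problem, i.e. that x satisfies the weighted normal equations A'BAx = A'Bb:

using LinearAlgebra
n, k = 8, 3
M = randn(n, n); B = M' * M + I   # random positive-definite weight matrix
A = randn(n, k); b = randn(n)
U = cholesky(B).U
x = (U * A) \ (U * b)             # least squares in the B norm
@assert A' * B * A * x ≈ A' * B * b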

I’m just typing this off the top of my head, so hopefully there are not too many screw-ups in the above algebra, but I think the general idea is right. I wouldn’t be surprised if the Cholesky approach were faster than the LAPACK GLM solver, since it is specialized for symmetric positive-definite B whereas the GLM solver handles more general cases.


There is a placeholder package, JuliaMatrices/LAPACK.jl, with the intention that it will become a common home for all wrapped LAPACK routines. The package is currently empty; however, it could be a place to put ggglm.

Interesting solution: if I understand correctly, you absorb the dot product into A and b by making use of the Cholesky factorization B = U'U (where U is an upper-triangular matrix). Then you solve the problem min ||Ax - b||_B by instead solving min ||U*A*x - U*b||_2, and the algorithm is backward stable because the Cholesky and QR factorizations are stable.
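The key identity behind this change of variables is ||v||_B^2 = v'Bv = (Uv)'(Uv) = ||Uv||_2^2, which a quick check confirms (toy data, my sketch):

using LinearAlgebra
M = randn(5, 5); B = M' * M + I
U = cholesky(B).U
v = randn(5)
@assert sqrt(v' * B * v) ≈ norm(U * v)   # ||v||_B = ||Uv||_2 since B = U'U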

It looks like the method is not going to work, but not because of the custom dot product.

The matrix A has continuous functions {f_i} as columns, and to form Ax - b one would need to compute the terms \int f_i f_j dV, which are the elements of the matrix A'*A. At this point, one might as well solve the problem using the normal equations.