Solving a Shifted Linear System of Two Symmetric Positive Semi-Definite (SPSD) Matrices

One option is to first Cholesky-factor C = U^T U (this assumes C is actually positive definite, so that U is invertible), let y = U x, and left-multiply by U^{-T} to rewrite the system (A + \lambda C) x = A^T b as:

(\underbrace{U^{-T} A U^{-1}}_{B} + \lambda I) y = U^{-T} A^T b = c

Note that the matrix B is still SPSD. Also, as usual, you don’t actually invert any matrices: you compute B with triangular solves, B = U' \ A / U in Julia.

Once you have it in this form, you can call hessenberg(Hermitian(B)) to factorize B into Q H Q^T, where H is symmetric tridiagonal. The advantage of this is that you can then do shifted solves without repeating the factorization, since B + \lambda I = Q (H + \lambda I) Q^T.
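
To see that identity concretely, here is a quick check you could run (with B and λ as in the code below; Q is materialized as a dense matrix only for the comparison):

F = hessenberg(Hermitian(B))
Q = Matrix(F.Q)                        # dense copy of the orthogonal factor
@show Q * (F.H + λ*I) * Q' ≈ B + λ*I   # should print "true"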

(Diagonalizing B with an eigendecomposition would also accomplish this, but that factorization is more expensive than Hessenberg. On the other hand, diagonalization gives a faster solve step, so the better choice depends on how many values of \lambda you need to solve with.)
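
For reference, the diagonalization variant might look like this (a minimal sketch, with B, c, and λ as in the code below):

Λ, Q = eigen(Hermitian(B))        # B = Q * Diagonal(Λ) * Q'
y = Q * ((Q' * c) ./ (Λ .+ λ))    # y = (B + λ*I) \ c in the eigenbasis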

Moreover, the Hessenberg factorization objects in Julia have efficient specialized support for shifted solves, for exactly this sort of application.
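
If you want to try the recipe below on random test matrices, a minimal setup might be (n, G, and K are hypothetical names; C is made strictly positive definite so that its Cholesky factor is invertible):

using LinearAlgebra
n = 100
G, K = randn(n, n), randn(n, n)
A = G' * G       # SPSD
C = K' * K + I   # SPD, so cholesky(C) succeeds
b = randn(n)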

In short, you can do something like:

# pre-processing (do this once)
U = cholesky(C).U           # UpperTriangular factor: C = U'U
B = Hermitian(U' \ A / U)   # B = U⁻ᵀ A U⁻¹, computed via triangular solves
c = U' \ (A'b)              # transformed right-hand side
F = hessenberg(B)           # B = Q H Q' with H symmetric tridiagonal

# solve for each λ
λ = 0.1234
y = (F + λ*I) \ c   # efficient shifted solve, no re-factorization
x = U \ y           # undo the change of variables y = U*x

# check:
@show x ≈ (A + λ*C) \ (A'b)  # should print "true"

Note that F + λ*I is actually lazy: it doesn’t even allocate a new matrix. And you can do the y and x solves in-place too, if you have pre-allocated vectors x and y:

ldiv!(y, F + λ*I, c)   # overwrites y with (B + λ*I) \ c
ldiv!(x, U, y)         # overwrites x with U \ y
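
So a full sweep over many shifts can reuse both the factorization and the buffers; a minimal sketch (λs and xs are hypothetical names for the shifts and the pre-allocated solutions):

λs = 10.0 .^ (-3:0)               # hypothetical array of shifts
xs = [similar(c) for _ in λs]     # pre-allocated solution vectors
y = similar(c)                    # scratch buffer
for (i, λ) in enumerate(λs)
    ldiv!(y, F + λ*I, c)          # y = (B + λ*I) \ c, in-place
    ldiv!(xs[i], U, y)            # xs[i] = U \ y, in-place
end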

Usually if you are doing Tikhonov regularization, the resulting system should be well-conditioned, no?

If it’s ill-conditioned, you could equivalently solve the ordinary least-squares problem

\min_x \left\Vert \begin{pmatrix} D \\ \sqrt{\lambda} E \end{pmatrix} x - \begin{pmatrix} b \\ 0 \end{pmatrix} \right\Vert_2

by QR as usual (\ in Julia), but I’m not sure if there is a way to re-use calculations for multiple \lambda in that case.
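
For a single λ, that stacked solve might look like this (a sketch, assuming D and E are the matrices in the least-squares formulation above):

Dλ = [D; sqrt(λ) * E]           # stacked (m+p)×n matrix
bλ = [b; zeros(size(E, 1))]     # right-hand side padded with zeros
x = Dλ \ bλ                     # least-squares solution via QR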
