Solving a Shifted Linear System of 2 Symmetric Positive Semi-Definite (SPSD) Matrices

There are a few options, but this is a harder problem, and those options aren’t as nice as the ones for A - \lambda I. I’m assuming that you aren’t trying to exploit sparsity. You could try treating this as a symmetric semidefinite generalized eigenvalue problem and simultaneously diagonalize A and C with a common congruence formed from generalized eigenvectors. The stability of that is not perfect, however: the eigenvector matrix can be ill-conditioned, and if C is semidefinite instead of positive definite, you are not even guaranteed a complete set of eigenvectors.
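For reference, a minimal Julia sketch of that route, assuming C is actually positive definite (LAPACK’s symmetric-definite generalized eigensolver requires it), with made-up n, A, C, b and a hypothetical solve_shift helper:

```julia
using LinearAlgebra

n = 100
M = randn(n, n); A = Symmetric(M' * M)      # placeholder SPSD matrix
N = randn(n, n); C = Symmetric(N' * N + I)  # placeholder; made definite for this sketch
b = randn(n)

# A*V = C*V*Diagonal(w) with V'*C*V = I, hence V'*A*V = Diagonal(w),
# so the congruence V diagonalizes both matrices at once.
w, V = eigen(A, C)
Vtb = V' * b

# (A + λ*C)*x = b  ⇒  x = V*((V'*b) ./ (w .+ λ)); each shift is a
# diagonal solve plus multiplications by V, so O(n^2) per shift.
solve_shift(λ) = V * (Vtb ./ (w .+ λ))
```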

What I have done before is just to give up on exploiting symmetry. If you compute a Hessenberg-triangular form of A and C:

Q^T A Z = H, \qquad Q^T C Z = R

where H is upper Hessenberg and R is upper triangular. Q and Z are orthogonal. Your problem transforms to

(H + \lambda R) \mathbf{y} = Q^T A^T \mathbf{b}

where \mathbf{y} = Z^T \mathbf{x}. Each choice of \lambda then gives a Hessenberg system, which can be solved in O(n^2) operations rather than O(n^3). The LAPACK routine for Hessenberg-triangular reduction is xGGHRD. Unfortunately, there doesn’t appear to be an existing Julia interface to that routine, but it probably isn’t too hard to write; I’ve found the existing Julia wrappers easy enough to follow as a template for accessing other LAPACK routines. Although, to be honest, the last time I had a problem like this, I was too lazy to do that and just used schur(A,C), so it had to run the full QZ algorithm. It was mostly for some test code, and it slowed down the tests, but I could live with that.
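For what it’s worth, the schur(A,C) version is only a few lines. A sketch with made-up matrices and a hypothetical solve_shift helper; I complexify the inputs so the generalized Schur factors come out genuinely triangular rather than quasi-triangular:

```julia
using LinearAlgebra

n = 100
M = randn(n, n); A = M' * M      # placeholder SPSD matrices
N = randn(n, n); C = N' * N
b = randn(n)

# One O(n^3) QZ factorization up front: A = Q*S*Z' and C = Q*T*Z'.
# Complex input makes S and T genuinely upper triangular (the real
# generalized Schur form is only quasi-triangular).
F = schur(complex.(A), complex.(C))
rhs = F.Q' * complex.(A' * b)    # Q^T A^T b, matching the formula above

# Each shift is then an O(n^2) triangular solve plus a multiplication
# by Z; the result is real up to roundoff.
solve_shift(λ) = F.Z * (UpperTriangular(F.S + λ * F.T) \ rhs)
```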

There is a symmetry-preserving reduction that uses a congruence to make A tridiagonal and C diagonal, which is a lot closer to the A - \lambda I case, but you give up on orthogonality of the congruence. There is a description in this paper. I’d trust the non-symmetric approach more for stability.
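To make the tridiagonal-diagonal target concrete: when C happens to be positive definite, one such (non-orthogonal) congruence comes from a Cholesky factor combined with an orthogonal tridiagonalization. This is not the paper’s method and doesn’t handle the semidefinite case; it’s just a sketch of the idea, again with placeholder data:

```julia
using LinearAlgebra

n = 100
M = randn(n, n); A = Symmetric(M' * M)      # placeholder SPSD matrix
N = randn(n, n); C = Symmetric(N' * N + I)  # placeholder; definite for this sketch
b = randn(n)

L = cholesky(C).L
Ahat = Symmetric(L \ Matrix(A) / L')  # L^{-1} A L^{-T}, still symmetric
F = hessenberg(Ahat)                  # Ahat = Q*T*Q' with T = F.H symmetric tridiagonal

# The congruence W = L^{-T} Q gives W'*A*W = T and W'*C*W = I.
Wtb = F.Q' * (L \ b)

# The tridiagonal solve is O(n); applying Q and the Cholesky factor
# keeps each shift at O(n^2).
solve_shift(λ) = L' \ (F.Q * ((F.H + λ * I) \ Wtb))
```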

I don’t know of options that avoid condition squaring for the regularized least squares problem. I have actually thought a little about that before, but didn’t come up with anything, and I haven’t seen anything published. I’m not sure it’s reasonable to expect this to be possible: orthogonal transformations would automatically preserve symmetry when you form D^T D and E^T E, and symmetry and orthogonality don’t seem to go together for this type of reduction. So you can probably do something, but I wouldn’t expect it to be fully based on orthogonal transformations, and there might be stability concerns.
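For completeness, the shift-by-shift way to avoid condition squaring is to never form D^T D and E^T E at all and solve the stacked least squares problem by QR for each \lambda; the catch is that it redoes an O(mn^2) factorization per shift, which is exactly the cost a one-time reduction is supposed to amortize. A sketch, with D, E, and b standing in for your data:

```julia
using LinearAlgebra

# min_x ||D*x - b||^2 + λ*||E*x||^2 via QR on the stacked matrix;
# backslash on a tall matrix does a QR-based least-squares solve,
# so D'D and E'E are never formed.
solve_tikhonov(D, E, b, λ) = [D; sqrt(λ) * E] \ [b; zeros(size(E, 1))]
```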
