Defining a preconditioner for IterativeSolvers

Dear all,
I would like to solve a linear(ised) system of equations using IterativeSolvers, but I am not sure how to define the preconditioner.
The preconditioner is a symmetric positive-definite sparse matrix, and its action is applied in two steps: (1) compute the Cholesky factorisation of the preconditioner (done once, before calling the iterative solver); (2) apply the Cholesky factors during the iterative solve (back-substitutions).
I understand that the action of the preconditioner is handled internally with in-place back-substitutions, which is great.
But how should I pass the Cholesky factors (SuiteSparse.CHOLMOD.Factor{Float64}) to, say, gmres! or bicgstabl!? Can I simply do this?

PC_fact = cholesky(PC) # Factorise the preconditioner
gmres!(x, A, b; Pl=PC_fact)

So far I use a self-programmed gcr! translated from an earlier MATLAB code (where I actually hard-coded the action of the preconditioner), but I am confident that I’d be better off using solvers from IterativeSolvers.jl :smiley:

Yes, I think so. According to the docs, the preconditioner P is used by IterativeSolvers.jl via P \ x etcetera, and cholesky(A) in Julia returns a factorization object that supports \ and ldiv! (out-of-place and in-place left division), implemented via back/forward-substitution.
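For instance, here is a minimal sketch (with a small hypothetical SPD matrix standing in for the preconditioner) showing that the factorization object already supports the left division the solvers rely on:

```julia
using LinearAlgebra

# Small SPD matrix standing in for the preconditioner (illustrative only)
P = [4.0 1.0 0.0;
     1.0 3.0 1.0;
     0.0 1.0 2.0]
F = cholesky(P)    # factorize once, before the iterative solve

x = [1.0, 2.0, 3.0]
y = F \ x          # out-of-place solve via forward/back-substitution
@assert P * y ≈ x  # F \ x indeed applies the inverse of P

z = copy(x)
ldiv!(F, z)        # in-place variant, operating on a copy of x
@assert z ≈ y
```

The same pattern is what `Pl=PC_fact` relies on internally, with the caveat about the sparse CHOLMOD factor discussed further down the thread.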


Hi, this would probably work, but cholesky(A) is a full factorization. For a preconditioner, see e.g. IncompleteLU.jl or AlgebraicMultigrid.jl.

The point is that a preconditioner M for a matrix A is much simpler to factorize than the full matrix A. In practice one often doesn’t represent M, but just the factorization.
Also, cholesky is for symmetric positive-definite matrices; in that case you can use cg instead of gmres.
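To see where the preconditioner solve enters such a method, here is a toy preconditioned CG; the function name pcg and the diagonal (Jacobi) preconditioner are just for illustration, but IterativeSolvers.cg! applies Pl in the same way, one solve per iteration:

```julia
using LinearAlgebra

# Toy preconditioned conjugate gradient: F only needs to support `\`.
function pcg(A, b, F; tol=1e-10, maxiter=100)
    x = zeros(length(b))
    r = b - A * x
    z = F \ r                 # preconditioner application
    p = copy(z)
    rz = dot(r, z)
    for _ in 1:maxiter
        Ap = A * p
        α = rz / dot(p, Ap)
        x .+= α .* p
        r .-= α .* Ap
        norm(r) < tol && break
        z = F \ r             # one preconditioner solve per iteration
        rz_new = dot(r, z)
        β = rz_new / rz
        rz = rz_new
        p .= z .+ β .* p
    end
    return x
end

A = [4.0 1.0; 1.0 3.0]            # small SPD system for illustration
b = [1.0, 2.0]
x = pcg(A, b, Diagonal(diag(A)))  # Jacobi preconditioner supports `\`
@assert A * x ≈ b
```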

I have some material on this:

For a new take on sparse system solution which incorporates all these concepts and packages see LinearSolve.jl.


My understanding is that the original poster is factorizing some matrix P different from the original matrix A. It can be very sensible to use the full factorization of P in this case if P is much cheaper to factorize than A (e.g. P is banded).

Thanks for both of your inputs!

@steveng: exactly, I’m solving for a Newton step, but the Jacobian is not symmetric. Hence I use the Cholesky factorisation of the system resulting from Picard linearisation (symmetric positive definite). To my knowledge, it’s an efficient way to solve 2D non-linear problems (FDM/FEM). Not a candidate for 3D, though…
@j-fu : thanks for the suggestion, I’ll definitely try the available ILU and AMG once I’m back at work.

@Gravlax ldiv!(y, F, x) is not implemented for F::SuiteSparse.CHOLMOD.Factor{Float64}.
You can use LDLFactorizations.jl, if you want to use ldiv! with the factorization of a symmetric (positive definite) matrix.

I give some examples with preconditioners in my package Krylov.jl if it can help you:

The examples should also work with IterativeSolvers.jl.


Oh nice, that’s indeed very useful. If I understand correctly, you overcome the ldiv!/Cholesky-factor issue by adding a method to ldiv! with the following line, right?

ldiv!(y::Vector{T}, F::SuiteSparse.CHOLMOD.Factor{T}, x::Vector{T}) where T = (y .= F \ x)
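The same pattern can be sketched with a dense Cholesky factorization, so it runs without CHOLMOD; the method is named my_ldiv! here (a hypothetical name, to avoid adding a method to LinearAlgebra.ldiv! in the example), and it shows that the "in-place" wrapper still allocates a temporary for F \ x:

```julia
using LinearAlgebra

# Same pattern as the overload above, on a dense Cholesky stand-in:
# copy the (freshly allocated) result of F \ x into the output vector y.
my_ldiv!(y::Vector{T}, F::Cholesky{T}, x::Vector{T}) where T = (y .= F \ x)

P = [4.0 1.0; 1.0 3.0]
F = cholesky(P)
x = [1.0, 2.0]
y = similar(x)
my_ldiv!(y, F, x)
@assert P * y ≈ x
```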

Yes, you’re right.
Note that we allocate a new vector at each call of ldiv! for F \ x.
It’s not a “real” ldiv! routine.
We can’t do better, because the in-place backward and forward sweeps with permutations are not implemented in CHOLMOD.

Hi there, I’ve run into a similar issue recently: even factorisations that do provide an in-place ldiv! method (such as UMFPACK’s LU decomposition) turn out to allocate quite a lot. However, looking into UMFPACK’s documentation, it looks like it defines a set of functions (umfpack_*_wsolve) that take a pointer to a preallocated buffer from the user (and therefore do not allocate themselves).

I’m curious how hard it would be to expose this feature in the Julia wrapper, much like what FastLapackInterface does for LAPACK (the only problem being that the set of matrix formats LAPACK supports is quite limited…).


I remember that someone fixed this problem recently.

We will have this problem solved in future releases of Julia.


Thanks @amontoison that’s exactly what I was looking for!

I opened an issue about the problem with CHOLMOD in DrTimothyAldenDavis/SuiteSparse.

Great, thanks a lot for this. I did not understand that the problem was inherent to CHOLMOD; I thought it was related to the way it was wrapped in Julia.
That said, it is true that when using the library in C:
s = cholmod_solve (CHOLMOD_A, Lfact, f, c);
s is internally allocated. So, it’s probably a similar issue.