Generic Iterative Solvers: a Krylov methods example

Hi,

This is related to a discussion within the lengthy thread that spun off Yuri’s blog post. I don’t want to make that thread even longer with off-topic material.

In that thread, when discussing Julia’s ecosystem @juliohm asked why we have 3 packages for iterative solvers, and he cited IterativeSolvers.jl, Krylov.jl, and KrylovKit.jl.

As I’m very interested in iterative solvers and in this kind of “package ecosystem” issue, I dug a little into this, and I think I found the answer in the KrylovKit docs:

KrylovKit accepts general functions to represent the linear map or operator that defines the problem, without having to wrap them in a LinearMap or LinearOperator type. Of course, subtypes of AbstractMatrix are also supported. If the linear map (always the first argument) is a subtype of AbstractMatrix, matrix-vector multiplication is used, otherwise, it is applied as a function call.

The magic is, of course, achieved via multiple-dispatch.

See the extremely simple and elegant apply.jl code in that package, which contains lines like:

apply(A::AbstractMatrix, x::AbstractVector) = A * x
apply(f, x) = f(x)

which are then used:

function GMRES(operator, ...)
    y₀ = apply(operator, x₀)
    (...)
end
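To see why this idiom is so nice, here is a minimal self-contained sketch (my own toy code, not KrylovKit’s actual implementation): the solver body calls apply and never needs to know whether the operator is a matrix or a plain function. The Richardson iteration below is just the simplest iterative scheme I could use for illustration.

```julia
using LinearAlgebra

apply(A::AbstractMatrix, x::AbstractVector) = A * x   # matrix path
apply(f, x) = f(x)                                    # generic callable path

# Toy Richardson iteration x ← x + ω (b − A x); illustrative only.
function richardson(operator, b; ω = 0.1, iters = 500)
    x = zero(b)
    for _ in 1:iters
        x += ω * (b - apply(operator, x))
    end
    return x
end

A = [4.0 1.0; 1.0 3.0]
b = [1.0, 2.0]
x_mat = richardson(A, b)            # operator given as a matrix
x_fun = richardson(x -> A * x, b)   # same operator, matrix-free
```

Both calls run the exact same solver code; multiple dispatch picks the right apply method at the single point where the operator is used.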

I believe this is a pretty elegant example of multiple dispatch (in fact, I’m going to make it the multiple-dispatch example in my blog’s tutorial, once I actually have a blog), and also the most “Julian” way to write an iterative solver. So I guess I’m going to start recommending this package as well in my numerical computing tutorial (after testing it, of course).

Now, a few thoughts about the ecosystem.

As Julia developed, many great package names were available, like “IterativeSolvers”, which wins Google rankings trivially (by direct keyword hit). This package is also maintained by an organization called JuliaLinearAlgebra, which leads one to think that, somehow, IterativeSolvers.jl is the “official” Julia package for iterative solvers.

However, I believe IterativeSolvers.jl is simply the oldest package, and, for some reason, the creators of both Krylov.jl and KrylovKit.jl decided to create separate packages instead of opening PRs against the existing one, maybe because that was easier.

As for Krylov.jl, they list in their docs:

All solvers in Krylov.jl have an in-place version, are compatible with GPU, and work in any floating-point data type.

So I guess those features were part of what made them diverge from the other packages.

I guess in situations like this the different package authors can learn from each other, and yes, it would eventually be desirable for someone to merge all the contributions into a single package, to avoid duplicated effort and to give all users the best possible experience.

However, I’m not really sure how this could be accomplished. Maybe it’s just a matter of letting the individual contributors decide which package they are willing to contribute to, and letting the more motivated/friendlier package maintainer consolidate the best efforts.


The advantage of KrylovKit is that it allows vectors that are not <: AbstractArray, plus being compatible with GPUs. This was my main motivation for using it in BifurcationKit.jl. The downside is that its interface is out-of-place.

Now, we have LinearSolve.jl


Cool, it looks like you’ve wrapped every package for solving linear systems out there!

What I truly like about KrylovKit.jl is that you accept “anything” that has an associated apply function, and you then provide such apply functions for matrices and for plain functions. A user could even define their own type, code an apply method for it, and pass the type directly to your solver. I guess that is truly generic.
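To sketch what I mean (with apply defined locally here, since KrylovKit’s own apply is an internal function, and ShiftedIdentity being a made-up type for illustration):

```julia
# Mirror of the dispatch pattern: one generic entry point, many methods.
apply(A::AbstractMatrix, x::AbstractVector) = A * x
apply(f, x) = f(x)

# A hypothetical user-defined operator: y = α x + x, stored without a matrix.
struct ShiftedIdentity
    α::Float64
end

# One extra method is all it takes to plug the new type into the pattern.
apply(S::ShiftedIdentity, x) = S.α .* x .+ x

y = apply(ShiftedIdentity(2.0), [1.0, 2.0])   # [3.0, 6.0]
```

Any solver written purely in terms of apply would now accept ShiftedIdentity with no wrapper type at all.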

For example, I can do:

using KrylovKit

iter_A(x) = [ -2x[1] + x[2]; x[1:end-2] - 2x[2:end-1] + x[3:end] ; -2x[end] + x[end-1] ]
xt = rand(1000)
b = iter_A(xt)

sol = linsolve(iter_A,b)

Now, this runs, but it doesn’t converge (I’m not sure what your default algorithm is here, but that was not my main point anyway).
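As a sanity check on the operator itself (using only the stdlib, so independent of any solver package): iter_A is just matrix-vector multiplication by the 1-D discrete Laplacian, i.e. a SymTridiagonal with -2 on the diagonal and 1 off it, so one can verify the function against the matrix and compare against a direct solve:

```julia
using LinearAlgebra

iter_A(x) = [ -2x[1] + x[2]; x[1:end-2] - 2x[2:end-1] + x[3:end] ; -2x[end] + x[end-1] ]

n = 1000
A = SymTridiagonal(fill(-2.0, n), fill(1.0, n - 1))

xt = rand(n)
@assert iter_A(xt) ≈ A * xt   # the function and the matrix agree

b = iter_A(xt)
x_direct = A \ b              # stdlib tridiagonal direct solve as reference
```

So any iterative solution can be checked against x_direct (which should recover xt up to the conditioning of the Laplacian).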

At first sight, this kind of interface doesn’t seem to be possible with LinearSolve:

using LinearSolve

iter_A(x) = [ -2x[1] + x[2]; x[1:end-2] - 2x[2:end-1] + x[3:end] ; -2x[end] + x[end-1] ]
xt = rand(1000)
b = iter_A(xt)

prob = LinearProblem(iter_A, b) 

ERROR: MethodError: no method matching size(::typeof(iter_A), ::Int64)

So I guess that in LinearSolve you still need to set up something like LinearMaps somehow?

Anyway, I’m sticking to whichever package gives me unrestarted GMRES with a custom number of iterations; that could be KrylovKit if I manage to configure it correctly, or IterativeSolvers, which I’m already using.

I did not write that package

You need to use a SciMLOperator. The interface package is being separated out soon into GitHub - SciML/SciMLOperators.jl; the documentation won’t be up until the end of the week, but it will be mostly the same as DiffEqOperators · DifferentialEquations.jl . The problem is that LinearMaps turned out to be unusable in most solver contexts because its API is not expressive enough, so we just couldn’t use it in differential equation contexts, and that necessitated creating a whole new interface, which LinearSolve.jl expects.

Anyway, we planned to have it finalized by the end of the week and are on track, so we’ll have an update rather soon.


Thank you, Chris. I’ll keep an eye on LinearSolve then and wait for these updates. It looks great as a reference package implementing every possible method out there.

Doesn’t IterativeSolvers.jl equivalently accept any object that defines LinearAlgebra.mul! (which has the advantage that it gives you an in-place interface)? Similarly for Krylov.jl.
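Here is a stdlib-only sketch of the kind of matrix-free operator such an in-place interface can consume, assuming the solver only ever calls mul!(y, A, x), size, and eltype (check each package’s docs for the exact set of required methods; Laplacian1D is a made-up type for illustration):

```julia
using LinearAlgebra
import LinearAlgebra: mul!
import Base: size, eltype

# Matrix-free 1-D Laplacian on vectors of length n, applied in place.
struct Laplacian1D
    n::Int
end

size(L::Laplacian1D) = (L.n, L.n)
size(L::Laplacian1D, ::Int) = L.n
eltype(::Laplacian1D) = Float64

function mul!(y::AbstractVector, L::Laplacian1D, x::AbstractVector)
    n = L.n
    y[1] = -2x[1] + x[2]
    for i in 2:n-1
        y[i] = x[i-1] - 2x[i] + x[i+1]
    end
    y[n] = x[n-1] - 2x[n]
    return y
end

L = Laplacian1D(5)
y = similar(ones(5))
mul!(y, L, ones(5))   # y == [-1.0, 0.0, 0.0, 0.0, -1.0]
```

The in-place advantage is exactly that mul! reuses y on every solver iteration instead of allocating a fresh vector the way apply-style A * x does.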


Yes. The bigger issue is that the preconditioner interfaces are not well aligned, though.

The above answers show that the real bonus of KrylovKit is that the state space need not be an AbstractArray.
Another pro of this library is the eigensolver.


The other main packages for non-Hermitian iterative eigensolvers are ArnoldiMethod.jl and Arpack.jl