I’m exploring the idea of solving large, sparse symmetric positive definite (SPD) linear systems using hybrid approaches that combine direct methods (e.g., Cholesky) with iterative methods (e.g., CG with an incomplete Cholesky preconditioner), switching automatically depending on the situation (for instance, when fill-in makes the Cholesky factor too dense to store).
In most environments, you have to manually try a sparse Cholesky, monitor fill-in or memory usage, and fall back to an iterative method if things go wrong. But Julia seems uniquely well-suited to automate this kind of logic:
- Symbolic inspection of sparsity patterns (Symbolics.jl?)
- Exploring multiple fill-reducing orderings (METIS, AMD, RCM, etc.)
- Runtime decision-making (BenchmarkTools.jl)
- JIT compilation for custom solver paths
- Dynamic precision (DoubleFloats.jl, ArbNumerics.jl)
- GPU fallback (KernelAbstractions.jl, CUDA.jl)
- Seamless switching between CHOLMOD, IterativeSolvers.jl, Krylov.jl, etc.
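On the orderings point, CHOLMOD (via `SparseArrays.cholesky`) already accepts a user-supplied permutation, so comparing fill-in across orderings only takes a few lines. A rough sketch, assuming AMD.jl is installed (its `amd` function computes an approximate-minimum-degree permutation); the matrix `A` here is just a random stand-in:

```julia
using LinearAlgebra, SparseArrays
using AMD  # provides amd(A): approximate minimum degree ordering

# A random sparse SPD test matrix (strictly diagonally dominant, hence SPD).
A = sprand(200, 200, 0.05)
A = A + A' + 200I

# Factor once with CHOLMOD's default ordering and once with an explicit
# AMD permutation, then compare the number of nonzeros in the factors.
F_default = cholesky(Symmetric(A))
F_amd     = cholesky(Symmetric(A); perm = amd(A))
@show nnz(F_default) nnz(F_amd)
```

The same pattern extends to METIS or RCM permutations; `nnz` on the factor gives exactly the fill-in number an adaptive solver would want to inspect.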
The goal would be a black-box solver that first attempts a sparse Cholesky factorization, and if the symbolic fill-in is too high or memory usage becomes problematic, transparently switches to an iterative method with a suitable preconditioner — potentially even moving to the GPU and lower-precision arithmetic for cheaper iterations, if needed.
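To make the switching logic concrete, here is a minimal sketch of what the direct-then-iterative fallback could look like. Everything in it is illustrative rather than prescriptive: the `fill_budget` heuristic, the Jacobi (diagonal) preconditioner, and the tolerance are placeholders, and it assumes IterativeSolvers.jl for the CG part:

```julia
using LinearAlgebra, SparseArrays
using IterativeSolvers  # provides cg

# Try a sparse Cholesky first; fall back to preconditioned CG if the
# matrix is not numerically SPD or the factor exceeds a fill budget.
function adaptive_solve(A::SparseMatrixCSC, b::AbstractVector;
                        fill_budget = 20 * nnz(A))
    F = try
        cholesky(Symmetric(A))   # CHOLMOD picks a fill-reducing ordering
    catch err
        err isa PosDefException || rethrow()
        nothing                  # not numerically SPD: go iterative
    end
    if F !== nothing && nnz(F) <= fill_budget
        return F \ b             # direct path: factor was cheap enough
    end
    # Iterative path: CG with a simple Jacobi (diagonal) preconditioner.
    # A real implementation would use incomplete Cholesky here.
    P = Diagonal(Vector(diag(A)))
    return cg(A, b; Pl = P, reltol = 1e-10)
end
```

A production version would also guard the factorization with a memory check (and perhaps a symbolic-analysis pass instead of a full numeric factorization), but the try/factor/inspect/fall-back shape would stay the same.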
Is anyone working on something like this? Any recommendations for existing tools or patterns to build such an adaptive solver? I’d be interested in contributing or collaborating if there’s interest.
NB: This post was drafted with help from ChatGPT, but I take full responsibility for its content.