How to solve this Ax=b faster?

Here I uploaded a JLD2 file and a small script that runs the tests.

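In case it helps to see the structure without downloading anything, the script is roughly the following (the JLD2 file name is a placeholder, and A is stored as a sparse Hermitian matrix):

```julia
using JLD2, LinearAlgebra, SparseArrays, BenchmarkTools
using LinearSolve

println("loading variables ... ")
@load "Ar.jld2" A r          # file name is a placeholder
@show size(A)
@show ishermitian(A)
@show norm(r)

println("doing u = A\\r ...")
u = @btime $A \ $r
@show norm(r - A * u)

println("doing u = lu(A)\\r ...")
u = @btime lu($A) \ $r
@show norm(r - A * u)

prob = LinearProblem(A, r)

println("doing LinearSolve.solve(prob, UMFPACKFactorization()) ...")
sol = @btime solve($prob, UMFPACKFactorization())
@show norm(r - A * sol.u)

println("doing LinearSolve.solve(prob, KrylovJL_CG()) ...")
sol = @btime solve($prob, KrylovJL_CG())
@show norm(r - A * sol.u)
```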
Below is the output of the script:

```
loading variables ... 
A, r, 
size(A) = (162372, 162372)
ishermitian(A) = true
norm(r) = 9027.184755276294

doing u = A\r ...
  3.878 s (52 allocations: 1.39 GiB)
norm(r - A * u) = 5.766109468498555e-12

doing u = lu(A)\r ...
  13.797 s (72 allocations: 2.39 GiB)
norm(r - A * u) = 4.227052608619862e-12

doing LinearSolve.solve(prob, UMFPACKFactorization()) ...
  12.981 s (94 allocations: 2.71 GiB)
norm(r - A * sol.u) = 4.227052608619862e-12

doing LinearSolve.solve(prob, KrylovJL_CG()) ...
  70.360 s (50 allocations: 179.79 MiB)
norm(r - A * sol.u) = 0.00013389304531825536
```

You can tell KrylovJL_CG() is a conjugate gradient method from how little memory it allocates. I would also have expected a CG method to be faster on an SPD problem (probably the matrix is not big enough for it to outperform a very well optimized direct method).
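Note that the CG residual is also much larger than the direct solvers' residuals; relative to norm(r) it is about 1.5e-8, which looks like the default √eps relative tolerance of the underlying Krylov solver, so CG simply stopped earlier rather than converging to machine precision. Tightening the tolerance (a sketch using LinearSolve's common solver options; the value is just an example) would presumably make it even slower:

```julia
# prob = LinearProblem(A, r) as above; reltol value is just an example
sol = solve(prob, KrylovJL_CG(); reltol = 1e-12)
norm(r - A * sol.u)
```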
