KernelMatrices.jl - A software package for working with hierarchical matrices

If you end up comparing against multiple implementations of the ACA, would you mind benchmarking the one in KernelMatrices against them and letting me know how things look? At the time I released the package it definitely wasn’t competitive, but it has been quietly improving since then, so I am somewhat hopeful that it can now hold its own against more sophisticated implementations, particularly if you give the ACA a KernelMatrix object instead of a dense Matrix (the function accepts both).

Example calls to KernelMatrices’ ACA:

using KernelMatrices
using LinearAlgebra   # for norm, opnorm, and Diagonal below

# Actually low rank:
truU  = randn(1024, 100)
truV  = randn(1024, 100)
A1    = truU*truV'
(U,V) = KernelMatrices.ACA(A1, 1.0e-8)   # size(U, 2) ≈ 100
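# Quick check of the recovered rank and reconstruction error:
println("ACA rank: ", size(U, 2))
println("opnorm(A1 - U*V'): ", opnorm(A1 - U*V'))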

# A2 is full rank in exact arithmetic, but numerically low rank: the Gaussian
# kernel is entire (analytic everywhere), so its singular values decay rapidly.
# Compared against the partial SVD (psvd) from LowRankApprox.jl:
using LowRankApprox
rand_pts = [randn(2) for _ in 1:1024]
A2  = [exp(-norm(x-y)^2) for x in rand_pts, y in rand_pts]
for rnk in (50, 150, 250)
  (fU,fV) = KernelMatrices.ACA(A2, 1.0e-16, rnk) 
  (pU, pS, pV) = psvd(A2, rank=rnk)
  println("Rank $rnk:")
  println("pSVD opnorm difference: $(round(opnorm(A2-pU*Diagonal(pS)*pV'), digits=5))")
  println("ACA  opnorm difference: $(round(opnorm(A2-fU*fV'), digits=5))\n\n")
end
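
Since the examples above only exercise the dense-Matrix path, here is a rough sketch of the KernelMatrix path plus a simple timing comparison with BenchmarkTools. I am writing the KernelMatrix constructor call (points, points, parameter vector, kernel function) from memory, and I am assuming the KernelMatrix method of ACA takes the same (tolerance, max rank) arguments as the dense method, so double-check both against the package README before relying on them.

using KernelMatrices, LowRankApprox, LinearAlgebra, BenchmarkTools

# Same Gaussian kernel as above, written as a function of two points and a
# parameter vector (the parameters are unused here):
gauss_kernel(x, y, prms) = exp(-norm(x - y)^2)

pts = [randn(2) for _ in 1:1024]

# Assumed constructor argument order: (row points, column points, parameters, kernel).
K = KernelMatrices.KernelMatrix(pts, pts, [1.0], gauss_kernel)

# ACA on the lazy KernelMatrix object, assuming the same (tol, max rank)
# arguments as the dense method:
(kU, kV) = KernelMatrices.ACA(K, 1.0e-16, 250)

# Timing comparison: lazy KernelMatrix vs. dense Matrix vs. psvd.
A = [gauss_kernel(x, y, nothing) for x in pts, y in pts]
@btime KernelMatrices.ACA($K, 1.0e-16, 250)
@btime KernelMatrices.ACA($A, 1.0e-16, 250)
@btime psvd($A, rank=250)

The point of the KernelMatrix version is that the ACA only evaluates the kernel entries it actually selects, so you never have to assemble the full 1024×1024 matrix up front.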