I am seeing surprisingly large allocations from
spdiagm. The compute times are fine for what I need to do, so perhaps this is a case where the large allocations do no harm. Here's the story.
I compute the discrete negative Laplacian with homogeneous Dirichlet
boundary conditions on the unit square, discretized on an n x n interior grid.
```julia
using SparseArrays

function Lap2d(n)
    h = 1 / (n + 1)
    maindiag = 4 * ones(n^2) / (h * h)
    sxdiag = -ones(n^2 - 1) / (h * h)
    sydiag = -ones(n^2 - n) / (h * h)
    # zero the x-coupling where the stencil would wrap across a grid line
    for iz = n:n:n^2-1
        sxdiag[iz] = 0.0
    end
    L2d = spdiagm(-n => sydiag, -1 => sxdiag, 0 => maindiag, 1 => sxdiag, n => sydiag)
    return L2d
end
```
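As a sanity check (my addition, not part of the original post; `Lap1d` and `Lap2d_kron` are names introduced here for illustration), the same operator can be built as a Kronecker sum of the 1-D second-difference matrix, and the diagonal entries should come out to 4/h^2:

```julia
using SparseArrays, LinearAlgebra

# 1-D negative second-difference matrix with homogeneous Dirichlet BCs, scaled by 1/h^2.
function Lap1d(n)
    h = 1 / (n + 1)
    spdiagm(-1 => fill(-1.0, n - 1), 0 => fill(2.0, n), 1 => fill(-1.0, n - 1)) / h^2
end

# 2-D operator as the Kronecker sum I ⊗ A + A ⊗ I.
Lap2d_kron(n) = kron(sparse(I, n, n), Lap1d(n)) + kron(Lap1d(n), sparse(I, n, n))
```

For n = 3 this gives a 9 x 9 symmetric matrix with 4/h^2 = 64 on the diagonal, matching what the function above builds directly.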
The nonzero values of the sparse matrix should take roughly 5 vectors of length n^2, i.e. 40 n^2 bytes (and CSC storage roughly doubles that once the Int64 row indices and column pointers are counted).
Not exactly …
```julia
julia> using BenchmarkTools

julia> n = 1000;

julia> @btime Lap2d($n);
  129.733 ms (81 allocations: 358.46 MiB)
```
So the finished matrix needs on the order of 40-90 MB, yet building it allocates 358 MiB, several times that.
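For reference, here is the back-of-the-envelope CSC count (my sketch; it assumes the default Int64 indices and Float64 values of a 64-bit build). A `SparseMatrixCSC` stores `colptr`, `rowval`, and `nzval`, so the footprint is about 16*nnz + 8*n^2 bytes rather than 8*nnz:

```julia
# Five diagonals of the stencil: main (n^2 entries), two x-neighbor
# diagonals (n^2 - 1 each), two y-neighbor diagonals (n^2 - n each).
n = 1000
nnz_5pt = n^2 + 2 * (n^2 - 1) + 2 * (n^2 - n)

# CSC arrays: nzval (8-byte Float64) + rowval (8-byte Int64) per stored
# entry, plus colptr with n^2 + 1 Int64s.
bytes = 16 * nnz_5pt + 8 * (n^2 + 1)
println(bytes / 1e6)   # ≈ 88 MB
```

So part of the gap between the 40 MB estimate and the 358 MiB allocated is just the index storage; the remainder would be temporaries created while `spdiagm` assembles the matrix.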
The timings are fast enough for what I need, so performance is fine.
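If the allocations ever did become a problem, one alternative construction (my sketch, not from the post; `Lap2d_coo` is a name introduced here, and whether it actually allocates less than `spdiagm` would need checking with `@btime`) is to assemble the coordinate triplets directly and call `sparse` once:

```julia
using SparseArrays

# Assemble the 5-point stencil in COO (triplet) form, then one call to sparse().
function Lap2d_coo(n)
    h2 = (n + 1)^2               # 1/h^2 with h = 1/(n+1)
    Is = Int[]; Js = Int[]; Vs = Float64[]
    sizehint!(Is, 5n^2); sizehint!(Js, 5n^2); sizehint!(Vs, 5n^2)
    idx(i, j) = i + (j - 1) * n  # column-major grid numbering
    for j in 1:n, i in 1:n
        k = idx(i, j)
        push!(Is, k); push!(Js, k); push!(Vs, 4.0 * h2)
        i > 1 && (push!(Is, k); push!(Js, idx(i - 1, j)); push!(Vs, -1.0 * h2))
        i < n && (push!(Is, k); push!(Js, idx(i + 1, j)); push!(Vs, -1.0 * h2))
        j > 1 && (push!(Is, k); push!(Js, idx(i, j - 1)); push!(Vs, -1.0 * h2))
        j < n && (push!(Is, k); push!(Js, idx(i, j + 1)); push!(Vs, -1.0 * h2))
    end
    sparse(Is, Js, Vs, n^2, n^2)
end
```

This skips the intermediate diagonal vectors and the explicit zeros that the original version stores on the x-neighbor diagonals, at the cost of more verbose code.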