Thanks for your help @cgeoga. This is a varinfo()
of my current session (almost a fresh REPL):
name size summary
---------------- ----------- -----------------------------------------------------------------------------------
A 58.802 MiB 172696×1301264932900 adjoint(::SparseMatrixCSC{Float64, Int64}) with eltype Float64
Ap 58.802 MiB 1301264932900×172696 SparseMatrixCSC{Float64, Int64}
Base Module
Core Module
InteractiveUtils 261.437 KiB Module
L1DIR 63 bytes 55-codeunit String
L2DIR 63 bytes 55-codeunit String
L2SYM 66 bytes 58-codeunit String
Main Module
ans 1.419 KiB Method
b 3.242 KiB 172696-element reshape(::SparseMatrixCSC{Int64, Int64}, 172696) with eltype Int64
c 136 bytes 1301264932900-element SparseVector{Float64, Int64}
mmread2primal 0 bytes typeof(mmread2primal)
mmread_threaded 0 bytes typeof(mmread_threaded)
symmetrize 0 bytes typeof(symmetrize)
v 360 bytes 172696-element SparseVector{Float64, Int64}
In the left division M\v, v is then a sparse vector of length 172696 and M should be a sparse matrix of size 172696 by 172696 (I don't actually know the density, but I think it should be sparse, and even if it is not, a dense matrix of that size can still fit into my memory).
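To make explicit what I mean by M and v in terms of the variables above (just a sketch; whether M = A * Ap can actually be formed in memory is exactly what I am unsure about):

using SparseArrays, LinearAlgebra

M = A * Ap            # intended Gram matrix Ap' * Ap, square 172696×172696, hopefully still sparse
x = M \ Vector(v)     # the left division in question; Vector(v) densifies the sparse right-hand side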
I think you are probably right about the density part. Earlier I did a sloppy test with sparse matrix multiplication of two sparse matrices created by sprand, and it worked fine for square matrices of size 1 billion. Now for a more accurate test, using the same density as my sparse matrix input:
fakeA = sprand(1301264932900, 172696, density); # ok, no problem
fakeA = sprand(172696, 1301264932900, density); # same call with the dimensions switched
ERROR: OutOfMemoryError()
Stacktrace:
[1] Array
@ .\boot.jl:448 [inlined]
[2] sparse_sortedlinearindices!(I::Vector{Int64}, V::Vector{Float64}, m::Int64, n::Int64)
@ SparseArrays C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SparseArrays\src\sparsematrix.jl:1562
[3] sprand(r::Random.MersenneTwister, m::Int64, n::Int64, density::Float64, rfn::typeof(rand), ::Type{Float64})
@ SparseArrays C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SparseArrays\src\sparsematrix.jl:1602
[4] sprand
@ C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SparseArrays\src\sparsematrix.jl:1612 [inlined]
[5] sprand(m::Int64, n::Int64, density::Float64)
@ SparseArrays C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SparseArrays\src\sparsematrix.jl:1610
[6] top-level scope
@ REPL[48]:1
Is this too big for SparseMatrixCSC? That seems consistent with varinfo(), where I do not have copy(A) stored directly (only the Adjoint object):
A 58.802 MiB 172696×1301264932900 adjoint(::SparseMatrixCSC{Float64, Int64}) with eltype Float64
Ap 58.802 MiB 1301264932900×172696 SparseMatrixCSC{Float64, Int64}
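As a back-of-the-envelope check (my own reasoning, not something from the error message): if I understand the CSC layout correctly, a SparseMatrixCSC stores a column-pointer array of length n + 1, so materializing copy(A) with n = 1301264932900 columns would need on the order of 10 TB just for the column pointers, which would also explain the sprand failure above with the switched sizes:

ncols = 1_301_264_932_900
colptr_bytes = (ncols + 1) * sizeof(Int64)   # one Int64 per column, plus one
colptr_bytes / 2^30                          # ≈ 9695 GiB, i.e. roughly 9.5 TiB, before any nonzeros

So the wide orientation is not too big for the type itself (Int64 indices can address it), but the per-column overhead is prohibitive, while the tall Ap with only 172696 columns is cheap to store, which matches the varinfo() sizes.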
Regarding @Oscar_Smith's suggestion, I will check again with Julia's dispatch to see if it is the method that I need. I basically used the advice in Computing sparse orthogonal projections for orthProject. Maybe I need to revise this thread now. But to reiterate, I just need to compute this (as in (A' * A) \ Vector(A' * v) in orthProject) once, so I don't really need the pseudoinverse. Anyway, a quick test (both calls dispatch to SuiteSparse.SPQR; see the @which check at the end of this post):
Ap\c;
ERROR: Sparse QR factorization failed
Stacktrace:
[1] error(s::String)
@ Base .\error.jl:33
[2] _qr!(ordering::Int32, tol::Float64, econ::Int64, getCTX::Int64, A::SuiteSparse.CHOLMOD.Sparse{Float64}, Bsparse::Ptr{Nothing}, Bdense::Ptr{Nothing}, Zsparse::Ptr{Nothing}, Zdense::Ptr{Nothing}, R::Base.RefValue{Ptr{SuiteSparse.CHOLMOD.C_Sparse{Float64}}}, E::Base.RefValue{Ptr{Int64}}, H::Base.RefValue{Ptr{SuiteSparse.CHOLMOD.C_Sparse{Float64}}}, HPinv::Base.RefValue{Ptr{Int64}}, HTau::Base.RefValue{Ptr{SuiteSparse.CHOLMOD.C_Dense{Float64}}})
@ SuiteSparse.SPQR C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SuiteSparse\src\spqr.jl:74
[3] qr(A::SparseMatrixCSC{Float64, Int64}; tol::Float64, ordering::Int32)
@ SuiteSparse.SPQR C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SuiteSparse\src\spqr.jl:205
[4] qr
@ C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SuiteSparse\src\spqr.jl:198 [inlined]
[5] \(A::SparseMatrixCSC{Float64, Int64}, B::SparseVector{Float64, Int64})
@ SparseArrays C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SparseArrays\src\linalg.jl:1569
[6] top-level scope
@ REPL[27]:1
A\c;
ERROR: Sparse QR factorization failed
Stacktrace:
[1] error(s::String)
@ Base .\error.jl:33
[2] _qr!(ordering::Int32, tol::Float64, econ::Int64, getCTX::Int64, A::SuiteSparse.CHOLMOD.Sparse{Float64}, Bsparse::Ptr{Nothing}, Bdense::Ptr{Nothing}, Zsparse::Ptr{Nothing}, Zdense::Ptr{Nothing}, R::Base.RefValue{Ptr{SuiteSparse.CHOLMOD.C_Sparse{Float64}}}, E::Base.RefValue{Ptr{Int64}}, H::Base.RefValue{Ptr{SuiteSparse.CHOLMOD.C_Sparse{Float64}}}, HPinv::Base.RefValue{Ptr{Int64}}, HTau::Base.RefValue{Ptr{SuiteSparse.CHOLMOD.C_Dense{Float64}}})
@ SuiteSparse.SPQR C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SuiteSparse\src\spqr.jl:74
[3] qr(A::SparseMatrixCSC{Float64, Int64}; tol::Float64, ordering::Int32)
@ SuiteSparse.SPQR C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SuiteSparse\src\spqr.jl:205
[4] qr
@ C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SuiteSparse\src\spqr.jl:198 [inlined]
[5] \(xformA::Adjoint{Float64, SparseMatrixCSC{Float64, Int64}}, B::SparseVector{Float64, Int64})
@ SparseArrays C:\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.6\SparseArrays\src\linalg.jl:1593
[6] top-level scope
@ REPL[30]:1
Edit: full error message for left division.
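For the dispatch check mentioned above, this is the kind of thing I plan to run (InteractiveUtils is already loaded in the session, and @which only looks up the method without executing the call):

using InteractiveUtils

@which Ap \ c     # the SparseMatrixCSC \ SparseVector method
@which A \ c      # the Adjoint-wrapped version

Both of these should resolve to the SparseArrays methods visible in the stack traces above (linalg.jl:1569 and linalg.jl:1593), which go through qr and hence SuiteSparse.SPQR.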