I am trying to solve a sparse linear system, but Julia 1.7.2 crashes (I believe it’s an `OutOfMemoryError()`):

```julia
infil> size(A)
(2938746, 6906)

infil> nnz(A)
17591409

infil> nnz(A)/prod(size(A))
0.0008667862253365853

infil> typeof(A)
SparseArrays.SparseMatrixCSC{Float64, Int64}

infil> typeof(b)
Vector{Float64} (alias for Array{Float64, 1})

infil> @which A \ b
\(A::SparseArrays.AbstractSparseMatrixCSC, B::AbstractVecOrMat) in SparseArrays at /Applications/Julia-1.7.app/Contents/Resources/julia/share/julia/stdlib/v1.7/SparseArrays/src/linalg.jl:1538

infil> delta = A \ b
Killed
```
MATLAB can solve the same linear system in less than 4 sec:
```matlab
K>> tstart = tic;
x = A \ b;
toc(tstart)
Warning: Rank deficient, rank = 6267, tol = 4.718235e+01.
> In test_Ab (line 115)
Elapsed time is 3.883479 seconds.
```
I thought that Julia would automatically use the QR decomposition of `A`, so the fact that `A` is not full rank would not be a problem… Any suggestions for debugging this?

This is the function referenced by the `@which` macro:
```julia
function \(A::AbstractSparseMatrixCSC, B::AbstractVecOrMat)
    require_one_based_indexing(A, B)
    m, n = size(A)
    if m == n
        if istril(A)
            if istriu(A)
                return \(Diagonal(Vector(diag(A))), B)
            else
                return \(LowerTriangular(A), B)
            end
        elseif istriu(A)
            return \(UpperTriangular(A), B)
        end
        if ishermitian(A)
            return \(Hermitian(A), B)
        end
        return \(lu(A), B)
    else
        return \(qr(A), B)
    end
end
```
It looks like, because `A` is not a square matrix, it does indeed default to the QR decomposition. Matlab’s documentation says it also defaults to QR for a rectangular sparse matrix. Maybe try calling `qr()` in both languages to see what you get?
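For instance, a minimal sketch of what that looks like on the Julia side (using the `A` and `b` from the original post):

```julia
using SparseArrays, LinearAlgebra

F = qr(A)   # SuiteSparse SPQR factorization; presumably the step that blows up
x = F \ b   # least-squares solve using the factorization
```

If the factorization alone already exhausts memory, that narrows the problem down to SPQR itself rather than the solve step.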
Seems more like a bug. Are you sure it is running out of memory? (Are you seeing increased consumption?) Having a way to replicate the problem in the post would be great.
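One rough way to check, as a sketch: print the available system memory just before the solve and watch whether the process dies during the factorization:

```julia
# report available system memory before attempting the solve
println("Free memory: ", round(Sys.free_memory() / 2^30; digits=2), " GiB")

delta = A \ b   # if the OS kills the process here, memory exhaustion is the likely cause
```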
Even with the QR decomposition, if the matrix is rank-deficient (i.e. if the columns are almost linearly dependent) then you have to drop some terms with some tolerance. (Google “rank-deficient least-squares”, which is what you are effectively doing.) Both Julia’s and Matlab’s QR functions drop nearly-dependent columns automatically, but the question is with what tolerance.
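A tiny self-contained illustration of the idea (a hypothetical 3×2 sparse matrix whose second column is a multiple of the first, so its numerical rank is 1):

```julia
using SparseArrays, LinearAlgebra

Asmall = sparse([1.0 2.0; 2.0 4.0; 3.0 6.0])  # column 2 = 2 * column 1, so rank 1
bsmall = [1.0, 1.0, 1.0]

x = qr(Asmall) \ bsmall  # SPQR detects the dependent column (up to its drop
                         # tolerance) and still returns a least-squares solution
```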
Julia’s `A \ b` is using `qr(A) \ b`, as commented above, but Matlab might be using a larger drop tolerance than Julia’s default. You might try `qr(A, tol=4.72e1) \ b` in Julia, passing the same tolerance that is reported by Matlab.
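As a concrete sketch, comparing the default against the tolerance from the MATLAB warning above:

```julia
using SparseArrays, LinearAlgebra

x_default = qr(A) \ b                     # SPQR's default drop tolerance
x_matched = qr(A, tol = 4.718235e1) \ b   # the tolerance reported by MATLAB
```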
Though according to Tim Davis, Julia is using the same tolerance formula (and the same library) as Matlab, unless they’ve changed since 2011.
How does `Killed` happen? Is Julia actually killed by the OS, or through user intervention? Maybe there is some maximum default amount of RAM programs are allowed to use, which is set higher for Matlab than for Julia?
macOS kills the Julia kernel. I have noticed that Julia running under Linux reports more useful information in similar situations, so I will try to reproduce this problem on a Linux VM. I have also tried @stevengj’s suggestion to increase `tol`, but that did not work. Eventually I want to create a repo with a link to the data and file an issue with the SuiteSparse.jl package.