Possible bug with \ operator

I’m using the \ operator to solve the linear system Ax = b in the general case, even when A is singular. For a singular square A this invariably throws a SingularException, but the documentation clearly states that \ should give the minimum-norm least-squares solution.
In case this is a bug, what would be the best way to report it?

using LinearAlgebra

julia> ones(3,2) \ ones(3)
2-element Vector{Float64}:
 0.5
 0.4999999999999999

julia> ones(3,3) \ ones(3)
ERROR: SingularException(2)

julia> ones(3,4) \ ones(3)
4-element Vector{Float64}:
 0.2500000000000001
 0.25
 0.25
 0.25

The issue here is that the LU factorization (a LAPACK call), which Julia uses by default for square matrices, cannot handle singular matrices. The pivoted QR factorization, which is the default for rectangular matrices, can. It’s not pretty, but a workaround (until more robust handling is implemented) is to invoke QR with column pivoting explicitly:

julia> qr(ones(3,3), ColumnNorm()) \ ones(3)
3-element Vector{Float64}:
 0.33333333333333337
 0.33333333333333337
 0.33333333333333337
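Another option (not mentioned above, but standard LinearAlgebra) is pinv, which computes the Moore–Penrose pseudoinverse; multiplying by it also yields the minimum-norm least-squares solution for a singular square matrix:

```julia
using LinearAlgebra

A = ones(3, 3)
b = ones(3)

# pinv(A) * b is the minimum-norm least-squares solution,
# even though A is singular.
x = pinv(A) * b   # ≈ [1/3, 1/3, 1/3]
```

This is generally more expensive than a pivoted QR (pinv goes through an SVD), so the qr workaround is preferable for large problems.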

I have opened an issue on GitHub.

By the way, it’s sometimes useful to get an error on a singular system rather than plow ahead with an approximate “solution”. It may be decided that the current behavior for singular square matrices is the preferable default, in which case no action will be taken on the issue. Note that the explicit qr invocation will still work if one insists on a solution to a singular square system. As it stands, the current behavior is consistent with the documentation of \:

  \(A, B)

  Matrix division using a polyalgorithm. For input matrices A and B, the result X is such that A*X == B when A is square. The solver that is used depends upon the structure
  of A. If A is upper or lower triangular (or diagonal), no factorization of A is required and the system is solved with either forward or backward substitution. For
  non-triangular square matrices, an LU factorization is used.

  For rectangular A the result is the minimum-norm least squares solution computed by a pivoted QR factorization of A and a rank estimate of A based on the R factor.

It’s only for rectangular systems that an approximate result is guaranteed when an exact solution is impossible.
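To see that guarantee in action, here is a rank-deficient rectangular system (a small check of my own, not from the docs): \ succeeds despite the rank deficiency, and the result should agree, up to rounding, with the pseudoinverse solution, which is minimum-norm least-squares by definition.

```julia
using LinearAlgebra

# 3×2, rank 1: rectangular but rank-deficient.
A = [1.0 1.0; 1.0 1.0; 1.0 1.0]
b = ones(3)

x = A \ b          # pivoted QR with rank estimation, no error thrown
x ≈ pinv(A) * b    # expected: true (agreement up to rounding)
```

For the analogous square rank-deficient matrix, the same \ call would throw SingularException instead.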


Thank you, this is really helpful.