help?> /
search: / //
[...]
A / B
Matrix right-division: A / B is equivalent to (B' \ A')' where \ is the left-division
operator. For square matrices, the result X is such that A == X*B.
See also: rdiv!.
Examples
≡≡≡≡≡≡≡≡
julia> A = Float64[1 4 5; 3 9 2]; B = Float64[1 4 2; 3 4 2; 8 7 1];
julia> X = A / B
2×3 Matrix{Float64}:
 -0.65   3.75  -1.2
  3.25  -2.75   1.0
julia> isapprox(A, X*B)
true
julia> isapprox(X, A*pinv(B))
true
You have a clue at the very end: the result is equivalent to multiplying by the pseudoinverse of B, which always exists. Backslash behaves the same way: A \ B finds a least-squares solution of the linear system, and that does not require A to be invertible.
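For instance (a small made-up example, not the matrices from your question), with a non-square, full-row-rank B the right-division is exactly that least-squares solution and matches A*pinv(B):

using LinearAlgebra

A = [1.0 0.0 1.0]               # 1×3
B = [1.0 2.0 3.0;
     4.0 5.0 6.0]               # 2×3, full row rank

X = A / B                       # least-squares solution of X*B ≈ A
isapprox(X, A * pinv(B))        # true: same as multiplying by the pseudoinverse
isapprox(X * B, A)              # false: the residual is only minimized, not zero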
Julia does effectively use the pseudoinverse (a least-squares solve) in the case in which A2 is not square. If A2 is square, it uses an LU factorization of A2. To be more precise, it takes a conjugate transpose first and returns
copy(adjoint(adjoint(A2) \ adjoint(A1)))
The operator \ then computes an LU factorization of adjoint(A2) and uses forward and backward substitution to apply the inverse. It does this in floating point, and with floating-point rounding there is no clear distinction between a singular matrix and a matrix that is merely close to singular. In general, when you compute the LU factorization of a matrix A, you do not get the exact factorization of A, but the exact factorization of a nearby matrix:
P(A+E) = LU
where E represents a backward error due to rounding. At that point you are effectively applying the inverse of a matrix A+E that is nearly, but not exactly, singular. That means dividing by some very small pivots, which is why you got the large numbers you saw in A1/A2.
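As an illustration (with a made-up nearly singular matrix, since your A1 and A2 aren't shown): the LU factors reproduce the matrix only up to a rounding-sized residual, and the division blows up.

using LinearAlgebra

A2 = [1.0 2.0; 2.0 4.0 + 1e-12]   # nearly singular: determinant ≈ 1e-12
A1 = [1.0 1.0]

F = lu(A2)
norm(F.L * F.U - F.P * A2)        # tiny: the backward error E is on the order of eps()

X = A1 / A2                       # entries on the order of 1e12, from dividing by a tiny pivot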
If you do know that your matrix is singular, using the pseudoinverse might be more appropriate, but Julia doesn't do that automatically for square matrices. If you want to avoid the rounding problem entirely, you could do something like the sketch below with rational matrices. In that case you will reliably get a SingularException.
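Here is a rough sketch of both options, again with a made-up singular matrix: exact Rational arithmetic makes the singularity detectable, while an explicit pinv gives a finite least-squares answer.

using LinearAlgebra

A2 = [1//1 2//1; 2//1 4//1]     # exactly singular, exact arithmetic
A1 = [1//1 1//1]

try
    A1 / A2                     # exact arithmetic: throws LinearAlgebra.SingularException
catch err
    @show err
end

X = Float64.(A1) * pinv(Float64.(A2))   # finite: a least-squares solution via the pseudoinverse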