Given a square (invertible) matrix `D` and a scalar `s`, the operations `s/D` and `D\s` are not allowed in Julia. I'd like to give some arguments both in favour of and against allowing these operations. Some related discussion can be found here and here. (My interest in this question stems from our current work on a macro that allows a convenient syntax for calling in-place operations such as `BLAS.trsm`.)
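For concreteness, here is a minimal sketch of the two disallowed forms (the exact error type and message may depend on the Julia version):

```julia
using LinearAlgebra

D = randn(3,3)   # square and, with probability one, invertible
s = randn()

# Both of the following error (a MethodError at the time of writing)
# instead of computing s*inv(D):
# s / D
# D \ s
```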
Although `s/D` and `D\s` are disallowed, all of the following can be done successfully in Julia:
```julia
using LinearAlgebra
D = randn(3,3); N = randn(3,3); s = randn()

inv(D)*s
D \ (s*N)
D \ 2N
D \ N*s
s*inv(D)
N*s/D
```
However, the following variants fail:
```julia
D \ s*N
D \ 2*N
```
because `\`, `*`, and `/` share the same precedence and are parsed from left to right, so `D \ s` is attempted before the multiplication by `N`. (The form `D \ 2N` succeeds because numeric-literal juxtaposition binds more tightly than `\`.)
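One way to see the grouping is to inspect how the expressions parse (a quick sketch; the printed s-expressions may be formatted slightly differently across Julia versions):

```julia
# `\`, `*` and `/` share the same precedence and associate to the left,
# so this expression groups as (D \ s) * N:
Meta.show_sexpr(:(D \ s*N))    # (:call, :*, (:call, :\, :D, :s), :N)

# Numeric-literal juxtaposition binds more tightly, so this one
# groups as D \ (2 * N):
Meta.show_sexpr(:(D \ 2N))     # (:call, :\, :D, (:call, :*, 2, :N))
```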
As pointed out in the discussions cited above, a scalar can be divided by a vector, which results in (a multiple of) the pseudoinverse of the vector. That is, the following can be done successfully:
```julia
v = randn(3)
@assert s/v ≈ s*pinv(v)
@assert v'\s ≈ s*pinv(v')
```
However, the following variants fail:
```julia
v \ s
s / v'
s / randn(2,3)
```
even though all of the above denominators do have pseudoinverses (the pseudoinverse always exists).
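Indeed, the pseudoinverses in question are all readily computable, so the failing forms could in principle be defined by analogy with the working ones; the following is only a sketch of what such hypothetical definitions would mean (the rectangular matrix `A` is an arbitrary example):

```julia
using LinearAlgebra

v = randn(3); s = randn(); A = randn(2,3)

# All of these succeed, so hypothetical methods such as
#   v \ s  == pinv(v) * s
#   s / v' == s * pinv(v')
#   s / A  == s * pinv(A)
# would be mathematically well defined:
pinv(v)     # row-vector-like pseudoinverse of a column vector
pinv(v')    # column-vector pseudoinverse of a row vector
pinv(A)     # 3×2 pseudoinverse of the 2×3 matrix A
```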
My argument in favour of more generally allowing the division of scalars by matrices is therefore one of consistency.
However, there is also a good argument for the case against: in general, `D \ (s*N)` is both faster and more accurate than `inv(D)*s*N`. If `D\s` were to be implemented as `s*inv(D)`, or `s*pinv(D)`, then `D \ s*N` would end up doing the slower, less accurate calculation. (The MATLAB editor will usually warn the programmer about attempts to multiply by an explicit inverse.)
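A rough illustration of the accuracy point, using an ill-conditioned Hilbert matrix; this is only a sketch (not a benchmark), and the exact residuals will vary from run to run:

```julia
using LinearAlgebra

n = 12
D = [1/(i + j - 1) for i in 1:n, j in 1:n]   # Hilbert matrix, notoriously ill conditioned
N = randn(n, n); s = randn()
B = s*N

x_solve = D \ B        # factorization-based solve
x_inv   = inv(D) * B   # explicit inverse, then multiply

# The residual of the factorization-based solve is typically the smaller one:
norm(D*x_solve - B), norm(D*x_inv - B)
```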
I therefore think that, in the interest of fast and accurate Julia code, dividing a scalar by a matrix should remain disallowed, and that, to improve consistency, dividing a scalar by a vector should perhaps be deprecated.