I would say this is very worthwhile and probably the right way to go, but am wary of it using `*`. It would be nice to have it prototyped in v1.0 using a different operator.
I actually had it left-associative at first. I think I’ll do this:

- Have both `leftmaterialize` and `rightmaterialize`, with `materialize(M::Mul) = leftmaterialize(M)`
- `M*x` lowers to `materialize(Mul(M.factors..., x))` and is slow and unadvised
- `M⋆x` lowers to `rightmaterialize(Mul(M.factors..., x))` and is fast.

Weak Laplacians can still be written `Δ = B'D'D*B`, as the construction of the operators works fine in either case.
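For intuition on why the right-to-left path is the fast one, here is a minimal sketch (with arbitrary sizes I picked for illustration) comparing the two association orders for a matrix–matrix–vector product:

```julia
A = randn(200, 200); B = randn(200, 200); x = randn(200)

# Left-associative: (A*B)*x pays for an O(n^3) matrix-matrix product first.
left = (A * B) * x

# Right-associative: A*(B*x) is just two O(n^2) matrix-vector products.
right = A * (B * x)

# Same result up to floating-point roundoff, very different cost.
@assert left ≈ right
```

The results agree up to roundoff, but the right-associative evaluation never forms a matrix-matrix product.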
What about something like this?

```julia
julia> reduce(
           (x,y)->begin
               println("I am doing $y*$x");
               y*x;
           end,
           reverse(
               "abcd" # here I simulate array of matrices/vectors ;)
           ))
I am doing c*d
I am doing b*cd
I am doing a*bcd
"abcd"
```
Could you also make use of https://github.com/AustinPrivett/MatrixChainMultiply.jl for an optimal `materialize` as the default? This works on any matrices, and would end up doing the `rightmaterialize` in an `A*B*v`.
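For reference, a minimal sketch of the cost computation such an optimal ordering performs (this is the textbook matrix-chain dynamic program, not MatrixChainMultiply.jl’s actual code):

```julia
# Matrix-chain ordering: given dimensions d such that matrix i is
# d[i] × d[i+1], compute the minimal number of scalar multiplications.
function chain_cost(d::Vector{Int})
    n = length(d) - 1                  # number of matrices
    cost = zeros(Int, n, n)            # cost[i,j]: best cost for product i..j
    for len in 2:n, i in 1:n-len+1
        j = i + len - 1
        cost[i, j] = minimum(cost[i, k] + cost[k+1, j] + d[i]*d[k+1]*d[j+1]
                             for k in i:j-1)
    end
    return cost[1, n]
end

# A (10×1000) * B (1000×1000) * v (1000×1): the optimum is the
# right-to-left order, matching the `rightmaterialize` path.
chain_cost([10, 1000, 1000, 1])   # → 1_010_000
```

Here the optimal cost `1_010_000` is exactly that of `A*(B*v)` (1000·1000·1 + 10·1000·1 scalar multiplications), whereas `(A*B)*v` would cost over ten million.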
As another data point, https://github.com/Jutho/LinearMaps.jl uses lazy right-to-left evaluation for products of `LinearMap`s.
That’s also our approach in LinearOperators.jl.
BTW, `*` in Julia’s AST does not say whether it’s left- or right-associative:
```julia
julia> dump(:(1 * 2 * 3))
Expr
  head: Symbol call
  args: Array{Any}((4,))
    1: Symbol *
    2: Int64 1
    3: Int64 2
    4: Int64 3
```
(I find that strange. Why is that?)
So it’s easy to get right-associative `*` à la future-import (provided you do it before any use of `*`):
```julia
julia> module RightAssociative
           *(a) = Base.:*(a)
           *(a, b) = Base.:*(a, b)
           *(args...) = *(args[1:end-2]..., *(args[end-1], args[end]))
       end
Main.RightAssociative

julia> using .RightAssociative: *

julia> Base.:*(x::Symbol, y) = Symbol("($x * $y)")

julia> :a * :b * :c
Symbol("(a * (b * c))")
```

where the result of the last output would be `Symbol("((a * b) * c)")` without `using .RightAssociative: *`.
I can only guess: you could, for example, multiply 3 matrices without an intermediate allocation? (So it looks beautiful to me rather than strange.)
```julia
julia> import Base.*

julia> *(a,b,c) = a+b+c
* (generic function with 344 methods)

julia> 3*4 == 3*4*5
true
```
One could define `function Base.:*(args::Vararg{AbstractArray,N}) where N` to choose the optimal order based on `size`s (maybe dispatch on `DenseArray`). Why isn’t `LinearAlgebra` doing that?
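A minimal sketch of what such a method could do for three matrices, choosing the cheaper association by size (the name `smart_mul` is hypothetical, and this is not what `LinearAlgebra` currently does; a real version would extend `Base.:*` itself):

```julia
# Hypothetical: pick the cheaper association for a 3-matrix product
# by counting scalar multiplications. NOT LinearAlgebra's actual behavior.
function smart_mul(A::AbstractMatrix, B::AbstractMatrix, C::AbstractMatrix)
    m, n = size(A); p = size(B, 2); q = size(C, 2)
    left  = m*n*p + m*p*q      # cost of (A*B) then *C
    right = n*p*q + m*n*q      # cost of (B*C) then A*
    return left <= right ? (A*B)*C : A*(B*C)
end
```

For `smart_mul(randn(2,100), randn(100,100), randn(100,1))` this picks the right-associative order, which is what `rightmaterialize` would do for `A*B*v`.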
I can see the point. But you can make the same argument for any (potentially) associative binary operator, like function composition. I found it puzzling because it seems this representation is used only for `*` and `+` AFAICT:
```julia
julia> dump(:(a + b + c))
Expr
  head: Symbol call
  args: Array{Any}((4,))
    1: Symbol +
    2: Symbol a
    3: Symbol b
    4: Symbol c

julia> dump(:(a - b - c))
Expr
  head: Symbol call
  args: Array{Any}((3,))
    1: Symbol -
    2: Expr
      head: Symbol call
      args: Array{Any}((3,))
        1: Symbol -
        2: Symbol a
        3: Symbol b
    3: Symbol c

julia> dump(:(a ∘ b ∘ c))
Expr
  head: Symbol call
  args: Array{Any}((3,))
    1: Symbol ∘
    2: Expr
      head: Symbol call
      args: Array{Any}((3,))
        1: Symbol ∘
        2: Symbol a
        3: Symbol b
    3: Symbol c
```
Looking at the parser, it’s used for `++` as well:
On second thought, (linear) operators are really “functions of vectors” (at least in their mathematical definition), so the mathematical notation ABCx is actually shorthand for `(A∘B∘C)(x)`, not `A*B*C*x`…
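A tiny illustration of that viewpoint, with operators wrapped as functions of vectors (the wrappers are ad-hoc closures, just for demonstration): composition with `∘` applies right-to-left by definition, which is exactly the fast evaluation order.

```julia
A = [1 2; 3 4]; B = [0 1; 1 0]; x = [1, 2]

# Treat each operator as a function of a vector.
applyA = v -> A * v
applyB = v -> B * v

# (A ∘ B)(x) means A applied to (B applied to x), i.e. A*(B*x).
@assert (applyA ∘ applyB)(x) == A * (B * x)
```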