Julia defines not only a Tridiagonal type but also a SymTridiagonal type. Why is this type needed, compared to Symmetric(Tridiagonal), which would still allow function specialization?
Possibly memory savings? Symmetric{<:Any, <:Tridiagonal} stores a Tridiagonal, which stores both the first superdiagonal and the first subdiagonal, while SymTridiagonal only stores one of those. I think very large SymTridiagonal matrices are often used, so memory limitations could be a concern.
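For what it's worth, here is a rough sketch of the storage difference (sizes are approximate; Base.summarysize counts the wrapped vectors):

using LinearAlgebra

dv = randn(10^6);       # main diagonal
ev = randn(10^6 - 1);   # one off-diagonal

S = SymTridiagonal(dv, ev)                    # stores dv and ev once
W = Symmetric(Tridiagonal(ev, dv, copy(ev)))  # the Tridiagonal stores both off-diagonals

Base.summarysize(S)  # ≈ 16 MB: two Float64 vectors
Base.summarysize(W)  # ≈ 24 MB: three vectors, one of them a redundant copy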
I agree with @matthieu. I think the SymTridiagonal type is to some extent redundant, because the number of possible structure combinations is unbounded: if we have SymTridiagonal, why shouldn't we also have SymSparse, etc.? If we want to save memory, we could simply piggyback on Bidiagonal, i.e.
julia> using LinearAlgebra

julia> dv = [1, 2, 3, 4];

julia> ev = [7, 8, 9];

julia> Bu = Bidiagonal(dv, ev, :U)
4×4 Bidiagonal{Int64, Vector{Int64}}:
 1  7  ⋅  ⋅
 ⋅  2  8  ⋅
 ⋅  ⋅  3  9
 ⋅  ⋅  ⋅  4

julia> STD = Symmetric(Bu)
4×4 Symmetric{Int64, Bidiagonal{Int64, Vector{Int64}}}:
 1  7  ⋅  ⋅
 7  2  8  ⋅
 ⋅  8  3  9
 ⋅  ⋅  9  4
which also yields the desired symmetric tridiagonal matrix.
Some necropost!
Yes, but it’s hard to make sure the infinitude of combinations all dispatch to the most efficient implementations of every algorithm. LAPACK has especially many specialized routines for real symmetric tridiagonal matrices, because reduction to that structure is an intermediate step in almost anything that involves Hermitian or real symmetric matrices. Hence it’s convenient to have a type for specifically representing such matrices, rather than having to use unintuitive wrapper combinations like Symmetric{T,Bidiagonal{T}} and hope that someone added all the necessary overloads to keep you on the fast path.
There are extremely specialized algorithms to work with SymTridiagonal matrices: e.g. LAPACK provides specialized eigensolvers for this case, and specialized routines to transform any Hermitian matrix into real SymTridiagonal form (accessible in Julia via e.g. hessenberg(Hermitian(...)); this is a special case of Hessenberg factorization). Such matrices play a very important role in numerical linear algebra.
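For instance (a minimal sketch; I believe eigen on a SymTridiagonal currently dispatches to a specialized LAPACK routine, stegr! if I recall correctly, but the exact routine is an implementation detail):

using LinearAlgebra

A = Symmetric(randn(5, 5))

F = hessenberg(A)        # reduce a real symmetric matrix to Hessenberg form
F.H isa SymTridiagonal   # true: for real symmetric input, H is SymTridiagonal

S = SymTridiagonal(randn(5), randn(4))
eigen(S)                 # hits the specialized symmetric-tridiagonal eigensolver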
I do think that if we were rebuilding LinearAlgebra from scratch today, we would just have Banded{lo, hi} and HermitianBanded{hi}, where Diagonal would be Banded{0, 0}, etc.
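For illustration, a rough sketch of what such a type could look like (purely hypothetical; none of these names exist in LinearAlgebra):

using LinearAlgebra

# Hypothetical Banded{lo,hi}: lo subdiagonals and hi superdiagonals,
# each diagonal stored as a plain vector of length n - |offset|.
struct Banded{lo,hi,T,V<:AbstractVector{T}} <: AbstractMatrix{T}
    n::Int
    bands::Vector{V}   # bands[k + lo + 1] holds diagonal offset k, for -lo <= k <= hi
end

Base.size(B::Banded) = (B.n, B.n)

function Base.getindex(B::Banded{lo,hi,T}, i::Int, j::Int) where {lo,hi,T}
    -lo <= j - i <= hi || return zero(T)
    B.bands[j - i + lo + 1][min(i, j)]   # min(i, j) indexes both sub- and superdiagonals
end

# The existing special types would then be aliases, roughly:
# Diagonal{T,V}    ~ Banded{0,0,T,V}
# Bidiagonal (:U)  ~ Banded{0,1,T,V}
# Tridiagonal{T,V} ~ Banded{1,1,T,V}

Banded{1,1,Int,Vector{Int}}(4, [[7, 8, 9], [1, 2, 3, 4], [7, 8, 9]])  # same matrix as STD above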