Inv not smart enough?

I don’t know if the issue here is inv, SMatrix, or UpperTriangular, or if I’m just missing something.

z = [1.0 2.0; 0.0 3.0]         # Matrix{Float64}
zs = SMatrix{2,2}(z);          # SMatrix{2, 2, Float64, 4}
zt = UpperTriangular(z)        # UpperTriangular{Float64, Matrix{Float64}}
zst = UpperTriangular(zs)      # UpperTriangular{Float64, SMatrix{2, 2, Float64, 4}}

z and zs are matrices just used to instantiate the rest, not otherwise used in the example, so just ignore them.

I would expect that the inverses of zt and zst are of the same types as zt and zst respectively, but this breaks for zst.

inv(zt)     # UpperTriangular{Float64, Matrix{Float64}} - expected
inv(zst)    # UpperTriangular{Float64, Matrix{Float64}} - unexpected

Curiously, this expectation holds if you take UpperTriangular out of the picture. For example,

inv(transpose(zst)*zst)  # SMatrix{2, 2, Float64, 4} with indices SOneTo(2)×SOneTo(2) - expected

There is no specialized method for inv(::UpperTriangular{<:Any,<:StaticMatrix}), so you just get the generic fallback. You could make a PR and add the method here.
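As a sketch of what such a specialization could look like (the name tri_inv is hypothetical; an actual PR would add a method to Base.inv in StaticArrays.jl): invert the static matrix and rewrap, so the triangular type is preserved.

```julia
using LinearAlgebra, StaticArrays

# Hypothetical sketch of a specialized inverse for static triangular matrices;
# a real fix would extend Base.inv inside StaticArrays.jl instead.
function tri_inv(A::UpperTriangular{<:Any,<:StaticMatrix{N,N}}) where {N}
    # materialize the triangular part as an SMatrix, invert statically, rewrap
    UpperTriangular(inv(SMatrix{N,N}(A)))
end

zst = UpperTriangular(SMatrix{2,2}(1.0, 0.0, 2.0, 3.0))  # [1 2; 0 3]
tri_inv(zst)  # UpperTriangular wrapping an SMatrix
```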

I guess most users do not run into this because \ specializes,

julia> zst \ SVector(1, 2)
2-element SVector{2, Float64} with indices SOneTo(2):

and inv is rarely ever needed directly anyway. If anything, a specialized lu method would be more useful.


I think the usual adage goes something like “Computing an explicit inverse is bad for performance and numerical accuracy, and is almost never a good way of doing things (except when you’re introducing the concept of inverses/linear maps in a course). A QR or LU factorization should almost always be preferred.”
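A minimal illustration of that advice (matrix and right-hand side made up for the example): factor once, then solve with the factorization instead of forming inv(A).

```julia
using LinearAlgebra

A = [4.0 1.0; 1.0 3.0]   # example matrix, made up for illustration
b = [1.0, 2.0]

F = lu(A)   # factor once (LU with partial pivoting)
x = F \ b   # solve via the factorization; no explicit inverse is formed
```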


Sure, I am aware. I was doing this to educate myself on both these kinds of linear-algebra tools and Julia. Specifically, I was trying to solve an overdetermined system. I did a Cholesky factorization in the first step, but even with the matrix factorized, I still have to solve two linear systems with triangular matrices. I knew that the best way of solving a triangular linear system isn’t inverting the matrix, and that I could implement the solve by hand, but I wanted to check that I got the correct numerical result after the factorization (which it did), so I investigated how to solve a triangular system in Julia. In the process I stumbled upon this surprising behavior.
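For the record, a sketch of the workflow described above (data made up for the example): Cholesky on the normal equations, followed by the two triangular solves.

```julia
using LinearAlgebra

A = [1.0 1.0; 1.0 2.0; 1.0 3.0]   # overdetermined 3×2 example, made up
b = [1.0, 2.0, 2.0]

C = cholesky(Symmetric(A' * A))   # factor the normal equations A'A x = A'b
y = C.L \ (A' * b)                # forward substitution with the lower factor
x = C.U \ y                       # back substitution with the upper factor
```

Note that the normal-equations route squares the condition number of A; for ill-conditioned problems, A \ b (which uses QR for overdetermined systems) is numerically preferable.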

So I guess this leads me to the next question: if I want to solve a straightforward linear system (N×N triangular matrix), what is the recommended way? You mentioned \ and that also gets me the right result, but that is only sensible for systems where there’s only one right way of solving them. Is this how you should use it, i.e. “the user should be smart enough to know not to use ‘\’ in ambiguous problems”? Or is there some better function in LinearAlgebra.jl?

btw I LOVE that in Julia “solving a linear system” has a function that comes loaded into the namespace by default. That hints at its scientific roots. Contrast this with Python, where you need an import even to calculate a square root or a log. And still no sign of \ there!!


This question is underspecified. The answer depends on

  1. what (if anything) you know about the matrix in addition to being triangular (bandedness, sparsity, any kind of special structure),
  2. how large N is,
  3. what you consider a satisfactory solution (eg tolerance for an iterative solver for large N),
  4. if you want to solve Ax = b for different b’s with the same A, or “nearly” the same A (again, iterative solvers).

Lacking these details, \ is the best option.
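For instance, for point 4 above (repeated solves with the same A), the usual pattern is to factorize once and reuse the factorization for each right-hand side (matrix made up for the example):

```julia
using LinearAlgebra

A = [4.0 1.0 0.0; 1.0 4.0 1.0; 0.0 1.0 4.0]   # example matrix, made up
F = factorize(A)            # inspects the structure and picks a factorization

x1 = F \ [1.0, 0.0, 0.0]    # reuse the factorization for each right-hand side
x2 = F \ [0.0, 1.0, 0.0]
```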


You don’t have to implement it by hand. \ with a triangular matrix will do the right thing.

Note also that \ with an overdetermined system is defined to give the least-squares solution.
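Both points in one small demo (numbers made up): \ on a triangular matrix dispatches to back substitution, and \ on an overdetermined system returns the least-squares solution.

```julia
using LinearAlgebra

# Triangular system: \ dispatches to back substitution, no inverse is formed.
U = UpperTriangular([1.0 2.0; 0.0 3.0])
x = U \ [1.0, 2.0]

# Overdetermined system: \ returns the least-squares solution (via QR).
A = [1.0 1.0; 1.0 2.0; 1.0 3.0]
b = [1.0, 2.0, 2.0]
xls = A \ b
```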