# Is there any backsolve function in Julia (for triangular matrices)?

Dear Julia users (and developers):
We are using the QR decomposition to obtain standard deviations of our beta coefficients (in an OLS linear model) through the “standard way”:

1. Obtain the variance-covariance matrix (note that `qrf` is a Julia object containing the QR decomposition results and `se2` is the residual variance):
   `bvcov = inv(qrf[:R]'qrf[:R]) * se2 # variance-covariance matrix`
2. Take the square root of its diagonal:
   `bstd = sqrt.(diag(bvcov)) # standard deviations of the beta coefficients`
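For reference, here is a minimal self-contained sketch of that procedure on made-up data (`X` and `y` are hypothetical; on Julia ≥ 0.7 the factor is accessed as `qrf.R` rather than `qrf[:R]`):

```julia
using LinearAlgebra

# Hypothetical data, for illustration only
X = hcat(ones(100), randn(100, 2))   # design matrix with an intercept column
y = randn(100)

qrf = qr(X)                          # QR factorization
beta = qrf \ y                       # OLS coefficients via least squares
resid = y - X * beta
se2 = sum(abs2, resid) / (size(X, 1) - size(X, 2))  # residual variance

bvcov = inv(qrf.R' * qrf.R) * se2    # variance-covariance matrix of beta
bstd  = sqrt.(diag(bvcov))           # standard deviations of the beta coefficients
```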

But this is somewhat inefficient, since it should be possible to obtain that diagonal directly.
In R, you can do this with the `backsolve` function (included in the `bdsmatrix` package; see the Stack Overflow question “How to calculate variance of least squares estimator using QR decomposition in R?”).
Looking around, I was unable to find a similar function in Julia.
I would be grateful if anyone could help me with this search.
Demian

https://docs.julialang.org/en/stable/stdlib/linalg/

In the link you posted, all they’re doing is computing the inverse using just the upper triangular part of R:

```r
Rinv <- backsolve(R, diag(qr.X$rank))
```

To do this in Julia, just:

```julia
Rt = UpperTriangular(R)
inv(Rt)
Rt \ I                      # backsolve, like they did
Rt \ eye(size(R, 1))        # backsolve, and allocate a matrix unnecessarily, like they did
LAPACK.trtri!('U', 'N', R)  # would modify R in place
```
```julia
julia> using BenchmarkTools

julia> using Compat, Compat.LinearAlgebra

julia> R = randn(600,400) |> x -> x' * x |> chol;

julia> typeof(R)
UpperTriangular{Float64,Array{Float64,2}}

julia> bench = @benchmarkable LAPACK.trtri!('U', 'N', Rcopy) setup = (Rcopy = copy($R.data))
Benchmark(evals=1, seconds=5.0, samples=10000)

julia> run(bench)
BenchmarkTools.Trial:
memory estimate:  16 bytes
allocs estimate:  1
--------------
minimum time:     2.258 ms (0.00% GC)
median time:      2.522 ms (0.00% GC)
mean time:        2.519 ms (0.00% GC)
maximum time:     3.275 ms (0.00% GC)
--------------
samples:          1740
evals/sample:     1

julia> @benchmark LAPACK.trtri!('U', 'N', copy($R.data))
BenchmarkTools.Trial:
memory estimate:  1.22 MiB
allocs estimate:  3
--------------
minimum time:     2.529 ms (0.00% GC)
median time:      2.851 ms (0.00% GC)
mean time:        2.908 ms (1.43% GC)
maximum time:     5.972 ms (16.32% GC)
--------------
samples:          1716
evals/sample:     1

julia> @benchmark inv($R)
BenchmarkTools.Trial:
memory estimate:  1.22 MiB
allocs estimate:  4
--------------
minimum time:     4.024 ms (0.00% GC)
median time:      4.727 ms (0.00% GC)
mean time:        4.814 ms (0.83% GC)
maximum time:     7.930 ms (12.65% GC)
--------------
samples:          1037
evals/sample:     1

julia> @benchmark $R \ eye(size($R,1)) # doing it exactly like in R
BenchmarkTools.Trial:
memory estimate:  2.44 MiB
allocs estimate:  5
--------------
minimum time:     4.357 ms (0.00% GC)
median time:      4.877 ms (0.00% GC)
mean time:        4.961 ms (1.64% GC)
maximum time:     8.213 ms (0.00% GC)
--------------
samples:          1007
evals/sample:     1

julia> @benchmark $R \ I
BenchmarkTools.Trial:
memory estimate:  1.22 MiB
allocs estimate:  5
--------------
minimum time:     5.520 ms (0.00% GC)
median time:      6.219 ms (0.00% GC)
mean time:        6.230 ms (0.62% GC)
maximum time:     8.313 ms (8.66% GC)
--------------
samples:          802
evals/sample:     1

julia> @benchmark inv($R.data)
BenchmarkTools.Trial:
memory estimate:  2.64 MiB
allocs estimate:  13
--------------
minimum time:     9.631 ms (0.00% GC)
median time:      10.618 ms (0.00% GC)
mean time:        10.653 ms (0.80% GC)
maximum time:     16.344 ms (0.00% GC)
--------------
samples:          469
evals/sample:     1
``````

I’d just stick with `inv(R)` (making sure R is UpperTriangular) or `LAPACK.trtri!`.
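Connecting this back to the original question: once you have R⁻¹, the diagonal of the covariance matrix can be read off without forming the full matrix, since `diag((R'R)⁻¹)` equals the squared row norms of R⁻¹. A sketch, with a hypothetical `R` and `se2` standing in for the ones from the question:

```julia
using LinearAlgebra

# Hypothetical stand-ins for the R factor and residual variance from the question
R = UpperTriangular([2.0 1.0; 0.0 3.0])
se2 = 0.5

Rinv = inv(R)                                      # triangular inverse (a backsolve)
bstd = sqrt.(se2 .* vec(sum(abs2, Rinv, dims=2)))  # sqrt.(se2 .* diag((R'R)⁻¹))
```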

EDIT:
FWIW, `inv` is competitive with calling `LAPACK.trtri!` directly on 0.7:

```julia
julia> run(bench)
BenchmarkTools.Trial:
memory estimate:  0 bytes
allocs estimate:  0
--------------
minimum time:     1.314 ms (0.00% GC)
median time:      1.503 ms (0.00% GC)
mean time:        2.224 ms (0.00% GC)
maximum time:     19.763 ms (0.00% GC)
--------------
samples:          1897
evals/sample:     1

julia> @benchmark LAPACK.trtri!('U', 'N', copy($R.data))
BenchmarkTools.Trial:
memory estimate:  1.22 MiB
allocs estimate:  2
--------------
minimum time:     1.638 ms (0.00% GC)
median time:      2.046 ms (0.00% GC)
mean time:        3.261 ms (5.55% GC)
maximum time:     67.026 ms (93.90% GC)
--------------
samples:          1531
evals/sample:     1

julia> @benchmark inv($R)
BenchmarkTools.Trial:
memory estimate:  1.22 MiB
allocs estimate:  3
--------------
minimum time:     1.627 ms (0.00% GC)
median time:      1.978 ms (0.00% GC)
mean time:        2.399 ms (2.45% GC)
maximum time:     67.416 ms (93.32% GC)
--------------
samples:          2079
evals/sample:     1

julia> @benchmark $R \ I
BenchmarkTools.Trial:
memory estimate:  1.22 MiB
allocs estimate:  3
--------------
minimum time:     1.981 ms (0.00% GC)
median time:      2.499 ms (0.00% GC)
mean time:        3.449 ms (4.78% GC)
maximum time:     70.179 ms (93.09% GC)
--------------
samples:          1447
evals/sample:     1

julia> @benchmark inv($R.data)
BenchmarkTools.Trial:
memory estimate:  1.42 MiB
allocs estimate:  6
--------------
minimum time:     7.588 ms (0.00% GC)
median time:      9.416 ms (0.00% GC)
mean time:        11.117 ms (1.98% GC)
maximum time:     78.641 ms (88.15% GC)
--------------
samples:          450
evals/sample:     1
``````

However, `inv` doesn’t simply forward to `trtri!`: the answers are approximately, but not exactly, equal.
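The approximate equality is easy to check on a small hypothetical matrix:

```julia
using LinearAlgebra

A = UpperTriangular([2.0 1.0; 0.0 3.0])  # hypothetical triangular matrix
B = copy(parent(A))
LAPACK.trtri!('U', 'N', B)               # in-place triangular inversion via LAPACK
inv(A) ≈ UpperTriangular(B)              # approximately equal
```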


Backsolve solves `Rx = b`:

```julia
UpperTriangular{<:AbstractMatrix{<:Number}} \ AbstractVector{<:Number}
```

Forwardsolve solves `Lx = b`:

```julia
LowerTriangular{<:AbstractMatrix{<:Number}} \ AbstractVector{<:Number}
```
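Both of these correspond to the triangular methods of `\`. A small sketch with hypothetical matrices:

```julia
using LinearAlgebra

b = [5.0, 6.0]

R = UpperTriangular([2.0 1.0; 0.0 3.0])
x = R \ b    # backsolve: solves R*x = b by back substitution

L = LowerTriangular([2.0 0.0; 1.0 3.0])
y = L \ b    # forwardsolve: solves L*y = b by forward substitution
```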

If what you want to get is the inverse of the normal matrix for OLS, then what you want to use is `chol2inv`.

In Julia, `chol2inv` is given by `inv(::Cholesky)`. In order to call `chol2inv` on a `QR` factorization, you can do

```julia
inv(Cholesky(QR.R, 'U', 0))
```

or pass the pivoting in case you are using `QRPivoted`.

```julia
using LinearAlgebra: Cholesky, qrfact, cholfact!
using BenchmarkTools: @btime
a() = rand(100,2) |> qrfact |> QR -> inv(cholfact!(QR.R'QR.R))
b() = rand(100,2) |> qrfact |> QR -> inv(Cholesky(QR.R, 'U', 0))
@btime a() # 5.916 μs (12 allocations: 4.31 KiB)
@btime b() # 5.604 μs (7 allocations: 4.00 KiB)
```

Thank you, Elrod.
I ran the same benchmarks myself and found some differences.

```julia
julia> bench = @benchmarkable Base.LinAlg.LAPACK.trtri!('U', 'N', Rcopy) setup = (Rcopy = copy($R.data))
Benchmark(evals=1, seconds=5.0, samples=10000)

samples:          1427
evals/sample:     1

samples:          2693
evals/sample:     1

samples:          1737
evals/sample:     1

samples:          1778
evals/sample:     1

samples:          332
evals/sample:     1
```