I describe a number of different ways of performing a simple linear regression in
https://github.com/dmbates/CopenhagenEcon/blob/master/jmd/03-LinearAlgebra.jmd
As shown in the README.md file for the repository, an effective way of running the .jmd file is to use Weave.convert_doc to create a Jupyter notebook and run that. (By the way, this notebook also shows how to use the R ggplot2 graphics package from Julia through RCall.)
It is difficult to benchmark a simple calculation like this, but I suspect that the method of augmenting the model matrix with the response vector and using a Cholesky factorization will be competitive. It has the advantage of also providing the sum of squared residuals without needing to evaluate the residuals themselves.
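To make the idea concrete, here is a minimal sketch of that approach (the data vectors `x` and `y` are made up for illustration): append `y` as an extra column of the model matrix, form the cross-product matrix, and take its Cholesky factor. The leading block of the upper-triangular factor gives the coefficients, and the square of the corner element is the residual sum of squares.

```julia
using LinearAlgebra

# Hypothetical small data set, purely for illustration
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

# Model matrix [1 x] augmented with the response as a third column
Xy = hcat(ones(length(x)), x, y)

# Upper Cholesky factor R of the 3×3 cross-product matrix Xy'Xy
R = cholesky(Symmetric(Xy'Xy)).U

# Coefficients come from a triangular solve with the leading 2×2 block;
# the squared corner element is the residual sum of squares
β = UpperTriangular(R[1:2, 1:2]) \ R[1:2, 3]
rss = abs2(R[3, 3])

# Sanity checks against the standard least-squares solution
@assert β ≈ Xy[:, 1:2] \ y
@assert rss ≈ sum(abs2, y - Xy[:, 1:2] * β)
```

The `@assert` lines at the end confirm that the augmented-Cholesky route reproduces the usual backslash solution while delivering the residual sum of squares for free.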