Is there a package with a LASSO/LARS algorithm? (I earlier used Lasso.jl, but it downgrades a bunch of packages, so I am looking for something else.)

# Lasso/lars

**ianfiske**#3

Try https://github.com/JuliaStats/GLMNet.jl. Looks to be more recently updated.

This is a generalization of LARS/LASSO (the elastic net) that trades off between L2 and L1 penalties. Set the alpha parameter to 1 to get a pure LASSO solution (see https://web.stanford.edu/~hastie/glmnet/glmnet_alpha.html for more details on the parameters).
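For reference, the elastic-net objective that glmnet minimizes (in its standard formulation) is

$$ \min_{\beta} \frac{1}{2N} \left\| y - X \beta \right\|_{2}^{2} + \lambda \left[ \frac{1 - \alpha}{2} \left\| \beta \right\|_{2}^{2} + \alpha \left\| \beta \right\|_{1} \right] $$

so $\alpha = 1$ recovers the pure LASSO penalty and $\alpha = 0$ recovers ridge regression.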

**andreasnoack**#4

Notice that there is an open PR to fix Lasso.jl so it will probably work again soon.

**AsafManela**#5

You can try my fork of Lasso.jl, which should work with v0.6.x until the PR is pulled.

**RoyiAvital**#6

If you’re into just solving the problem:

$$ \arg \min_{x} \frac{1}{2} \left\| A x - b \right\|_{2}^{2} + \lambda \left\| x \right\|_{1} $$

Then it would take only a few lines to get it done in Julia (and it will be a modern and fast solution).
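To illustrate what those "few lines" could look like (this is only a minimal sketch, not RoyiAvital's code; the names `ista` and `soft_threshold` are illustrative), here is ISTA, i.e. proximal gradient descent, for the objective above:

```julia
using LinearAlgebra

# Soft-thresholding: the proximal operator of t * ||.||_1
soft_threshold(v, t) = sign.(v) .* max.(abs.(v) .- t, 0)

# ISTA for  min_x  0.5 * ||A*x - b||_2^2 + lambda * ||x||_1
function ista(A, b, lambda; iters = 500)
    L = opnorm(A)^2                 # Lipschitz constant of the smooth part's gradient
    x = zeros(size(A, 2))
    for _ in 1:iters
        g = A' * (A * x - b)        # gradient of 0.5 * ||A*x - b||^2
        x = soft_threshold(x - g / L, lambda / L)
    end
    return x
end

# Usage: small random problem with a sparse ground truth
A = randn(50, 20)
x0 = zeros(20); x0[1:3] .= 2.0
b = A * x0
xhat = ista(A, b, 0.1)
```

With a noiseless right-hand side and a small `lambda`, `xhat` recovers the sparse support up to the usual soft-thresholding bias.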

P. S.

Is there a way to enable the MathJax plugin for this site?

**RoyiAvital**#8

I created a project on GitHub called L1 Regularized Least Squares. Nothing fancy, just some well-known modern methods for this problem; as expected, iteration-wise the Coordinate Descent method was the fastest.
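For the curious, a hedged sketch of cyclic coordinate descent for the same objective (this is not the code from that repository; `cd_lasso` is an illustrative name). Each sweep minimizes the objective exactly in one coordinate at a time via soft-thresholding, keeping the residual up to date:

```julia
using LinearAlgebra

# Cyclic coordinate descent for  min_x  0.5 * ||A*x - b||_2^2 + lambda * ||x||_1
function cd_lasso(A, b, lambda; sweeps = 100)
    n = size(A, 2)
    x = zeros(n)
    r = copy(b)                          # residual b - A*x (x starts at zero)
    sq = vec(sum(abs2, A; dims = 1))     # column norms ||a_j||^2
    for _ in 1:sweeps, j in 1:n
        aj = view(A, :, j)
        rho = dot(aj, r) + sq[j] * x[j]  # correlation with the partial residual
        xnew = sign(rho) * max(abs(rho) - lambda, 0) / sq[j]
        r .+= aj .* (x[j] - xnew)        # keep residual consistent with new x[j]
        x[j] = xnew
    end
    return x
end

# Usage: small random problem with a sparse ground truth
A = randn(40, 10)
x0 = zeros(10); x0[1:2] .= 1.5
b = A * x0
xhat = cd_lasso(A, b, 0.05)
```

Each coordinate update is an exact one-dimensional minimization, so the objective decreases monotonically; that is one reason coordinate descent tends to win iteration-wise on this problem.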

The implementation is in MATLAB but it will be easy to adapt it to Julia.

Once Julia 1.x.xxx is out I intend to put a few of those into the MATLAB vs. Julia test.

**TsurHerman**#9

Interesting … why does your objective function start at zero and go down to around -100?

Shouldn't it go down to zero?

**Paul_Soderlind**#11

Hi,

and thanks. Works very well. May I ask if you plan to eventually update to 0.7 syntax?

By the way, GLMNet.jl also worked well, but the msys2/gfortran dependency was a 3 GB installation (Win64).