Lasso.jl, GLMNet.jl - hyperparameters

I’m not seeing any documentation for this, but in Lasso.jl and GLMNet.jl I’m looking to do the following:

  1. Specify a validation set to tune hyperparameters on, rather than just using cross-validation. I know CV is generally better, but I’m looking to do this for academic reasons.
  2. Tune the alpha hyperparameter for elastic net, either with CV or with a specified validation set.

How would I do either of these?
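For concreteness, here’s the kind of thing I’d otherwise be hand-rolling with GLMNet.jl - a rough sketch, assuming `glmnet`, `predict`, and the path’s `lambda` field work the way I think they do; the alpha grid and the train/validation split are just mine for illustration:

```julia
using GLMNet, Statistics

function tune_enet(Xtrain, ytrain, Xval, yval; alphas = 0.0:0.1:1.0)
    best = (mse = Inf, alpha = NaN, lambda = NaN)
    for alpha in alphas
        path = glmnet(Xtrain, ytrain; alpha = alpha)  # fits the whole lambda path at once
        preds = predict(path, Xval)                   # n_val × n_lambda prediction matrix
        for (j, lambda) in enumerate(path.lambda)
            mse = mean(abs2, yval .- preds[:, j])
            mse < best.mse && (best = (mse = mse, alpha = alpha, lambda = lambda))
        end
    end
    return best
end

# toy stand-ins for a real train/validation split
Xtrain, ytrain = randn(100, 10), randn(100)
Xval,   yval   = randn(50, 10),  randn(50)
tune_enet(Xtrain, ytrain, Xval, yval)
```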


Have you looked at https://github.com/alan-turing-institute/MLJ.jl?

I haven’t looked at it in depth for this use case. MLJ has always seemed pretty daunting to me.


Not gonna lie, I felt the same way when I saw it, and haven’t thought of using it since… These are pretty elementary usage questions that the package maintainers or docs should be able to answer. Have you tried filing an issue?

My package has some lightweight classification/regression statistics if you need them (GitHub - caseykneale/ChemometricsTools.jl: A collection of tools for chemometrics and machine learning written in Julia.) and also some sampling tools (random splits, Kennard-Stone, CV, venetian blinds, etc.). No LASSO, I don’t think, but I could do a really quick and dirty implementation in like 20 minutes if you’re desperate. It wouldn’t be world class though…
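Quick and dirty would look something like this - plain proximal gradient (ISTA) on the lasso objective, no intercept, no standardization, so genuinely not world class:

```julia
using LinearAlgebra

# soft-thresholding operator: the prox of t * ||.||_1
soft(z, t) = sign(z) * max(abs(z) - t, zero(z))

# LASSO via ISTA: minimize (1/2n)*||y - X*b||^2 + lambda*||b||_1
function lasso_ista(X, y, lambda; iters = 1_000)
    n, p = size(X)
    b = zeros(p)
    L = opnorm(X)^2 / n            # Lipschitz constant of the smooth part's gradient
    for _ in 1:iters
        grad = X' * (X * b - y) / n
        b = soft.(b .- grad ./ L, lambda / L)
    end
    return b
end
```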

Alright so GLMNet has zero documentation - classy.

But as far as Lasso.jl goes, I think this page will help: Home · Lasso.jl
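From memory (so verify against that page), the pattern is a path fit plus a segment selector; `MinAICc` is one of the selection criteria I remember the docs listing:

```julia
using Lasso

X, y = randn(100, 10), randn(100)   # toy data

path = fit(LassoPath, X, y)             # fits the full regularization path
# pick one segment of the path by a criterion; the docs also list
# CV-based selectors like MinCVmse -- check the page linked above
beta = coef(path; select = MinAICc())
```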

I pored through all of the documentation for both prior to posting this. From what I could infer, though, GLMNet’s Fortran code doesn’t support hyperparameter tuning with a validation set.
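So the CV route seems to be: let `glmnetcv` handle lambda and grid alpha myself. A sketch, assuming the `meanloss` and `lambda` fields I saw in the source:

```julia
using GLMNet

X, y = randn(200, 10), randn(200)   # toy data

alphas = 0.0:0.1:1.0
cvs = [glmnetcv(X, y; alpha = a, nfolds = 10) for a in alphas]  # CV over lambda per alpha
losses = [minimum(cv.meanloss) for cv in cvs]                   # best CV loss per alpha
i = argmin(losses)
best_alpha  = alphas[i]
best_lambda = cvs[i].lambda[argmin(cvs[i].meanloss)]
```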

I’m revisiting MLJ, though, and it looks a lot more advanced than when I first looked at it. Bringing in the ScikitLearn.jl models was awesome.
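For anyone landing here later, MLJ’s `TunedModel` with `Holdout` resampling looks like it answers both of my questions at once. A sketch from memory, using MLJLinearModels’ `ElasticNetRegressor` (note it parameterizes the penalties as `lambda`/`gamma` rather than glmnet’s single `alpha` mix), so double-check the names:

```julia
using MLJ   # assumes MLJLinearModels is also installed

ElasticNet = @load ElasticNetRegressor pkg=MLJLinearModels

X, y = make_regression(200, 10)   # toy data

model = ElasticNet()
r_l2 = range(model, :lambda, lower = 1e-3, upper = 10.0, scale = :log)
r_l1 = range(model, :gamma,  lower = 1e-3, upper = 10.0, scale = :log)

tuned = TunedModel(model = model,
                   tuning = Grid(resolution = 10),
                   resampling = Holdout(fraction_train = 0.75),  # a validation set, not CV
                   range = [r_l2, r_l1],
                   measure = rms)

mach = machine(tuned, X, y)
fit!(mach)
report(mach).best_model
```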

Although GLMNet.jl has minimal explanations, extensive documentation is available for the Python, R, and MATLAB versions of glmnet.

Since GLMNet.jl is just a wrapper around the same Fortran library, the Python documentation should be a good place to start. The rest should be similar.