Hyperparameter optimization packages for ML applications in Julia

Hi all,

I am currently working with the Flux.jl package for implementing deep learning models. I was wondering if some of you have experience with hyperparameter optimization packages (e.g. ones implementing the Hyperband algorithm) that couple nicely with Flux.jl.

Thanks for sharing your experiences.

Simon.


An implementation of Hyperband is provided in https://github.com/baggepinnen/Hyperopt.jl#hyperband
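For context, the basic Hyperopt.jl workflow looks roughly like this (a minimal sketch based on the package README; the objective `f` is a toy stand-in for training a Flux model and returning a validation loss):

```julia
using Hyperopt

# Toy objective; in practice this would train a Flux model and
# return the validation loss for the given hyperparameters.
f(a, b) = (a - 3)^2 + abs(b)

# Evaluate 50 random samples from the given candidate ranges.
ho = @hyperopt for i = 50,
        sampler = RandomSampler(),          # other samplers exist, e.g. LHSampler()
        a = LinRange(1, 5, 50),
        b = exp10.(LinRange(-1, 3, 50))
    f(a, b)
end

ho.minimizer, ho.minimum  # best hyperparameters found and their cost
```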


Another option: https://github.com/DrChainsaw/NaiveGAflux.jl

I guess it has some half-accidental similarity to Hyperband, in the sense that it tries to focus training effort on promising candidates.

If one does not believe in parameter reuse, it should be possible to build Hyperband on top of it, e.g. by using elite selection of half the population size until only one candidate remains.
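The elite-selection idea can be sketched in plain Julia (this is not NaiveGAflux's actual API; `train!` and `score` are hypothetical stand-ins for your own training and validation code):

```julia
# Repeatedly halve the population, keeping the best-scoring half,
# until only one candidate remains.
function successive_halving(population, train!, score; epochs_per_round = 1)
    while length(population) > 1
        for cand in population
            train!(cand, epochs_per_round)    # spend a little budget on everyone
        end
        sort!(population; by = score)          # lower score = better (e.g. val loss)
        population = population[1:cld(length(population), 2)]  # keep elite half
    end
    return population[1]
end
```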


Hi @baggepinnen,

I'd like to ask if you could post an example here that uses Hyperband with a train_network call inside it. Please, could you do this for me? :) Thank you.

I am getting an error when trying to modify the working example with RandomSampler or LHSampler:

ERROR: LoadError: syntax: Global method definition around /Users/ies503/.julia/packages/Hyperopt/utzuO/src/samplers.jl:191 needs to be placed at the top level, or use "eval".
Stacktrace:
...
... 

Thanks a lot.

The macro @phyperopt works in the same way as @hyperopt but distributes the computation across available workers. The usual caveats apply: code must be loaded on all workers, etc.
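A minimal sketch of what "loaded on all workers" means in practice (the objective `f` is an illustrative stand-in):

```julia
using Distributed
addprocs(4)                               # spin up 4 local worker processes

@everywhere using Hyperopt                # the package must be loaded on all workers
@everywhere f(a, b) = (a - 3)^2 + abs(b)  # the objective must be defined everywhere

# Same syntax as @hyperopt, but evaluations run in parallel on the workers.
ho = @phyperopt for i = 100,
        a = LinRange(1, 5, 100),
        b = exp10.(LinRange(-1, 3, 100))
    f(a, b)
end
```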

You make it sound so easy! I think this is VERY impressive.

Such an example would depend entirely on the tools you use to train networks, since Hyperband requires you to save a meaningful state and resume training from it. You will find an example showing how to do this with Optim.jl here
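The save/resume pattern looks roughly like this (a sketch adapted from the Hyperopt.jl README's Optim.jl example; for Flux, `state` would instead hold your model and optimiser state, and the toy objective `f` is illustrative):

```julia
using Hyperopt, Optim

f(a, b) = (a - 3)^2 + abs(b - 10)  # stand-in for a validation loss

hohb = @hyperopt for i = 18,
        sampler = Hyperband(R = 50, η = 3, inner = RandomSampler()),
        a = LinRange(1, 5, 1800),
        b = exp10.(LinRange(-1, 3, 1800))
    # `state` is `nothing` the first time a candidate is seen; on resumption
    # it holds whatever we returned for that candidate last time.
    if state !== nothing
        a, b = state
    end
    res = Optim.optimize(x -> f(x[1], x[2]), [a, b], NelderMead(),
                         Optim.Options(f_calls_limit = round(Int, i)))
    # Return (cost, state) so Hyperband can resume promising candidates.
    Optim.minimum(res), Optim.minimizer(res)
end
```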


Thanks :) It's really Julia making it easy. Distributing work with Distributed is significantly easier than threading, as one does not have to worry about thread safety, etc.
