Any Black-Box Packages for Bayesian Hyperparameter Optimization?


Configuring hyperparameters in a machine learning problem can be a daunting and tedious task: each evaluation may take a long time to run, so an exhaustive grid search is often infeasible. Bayesian optimization offers an appealing and principled alternative to manual search, and packages for it exist in other languages; for example, HyperOpt is an open-source package developed in Python.
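To make the idea concrete, here is a minimal, self-contained sketch of the Bayesian-optimization loop: fit a Gaussian-process surrogate to the evaluations so far, then pick the next point by maximizing expected improvement. The 1D objective `(x - 0.3)**2` is a hypothetical stand-in for a real hyperparameter/validation-loss pair; this is a toy illustration, not what any particular package does internally.

```python
import math
import random

def kernel(a, b, lengthscale=0.3):
    """Squared-exponential (RBF) covariance between two scalar points."""
    return math.exp(-0.5 * (a - b) ** 2 / lengthscale ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, xq, jitter=1e-6):
    """GP posterior mean and variance at a query point xq."""
    n = len(xs)
    K = [[kernel(xs[i], xs[j]) + (jitter if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    k_star = [kernel(x, xq) for x in xs]
    alpha = solve(K, ys)              # K^{-1} y
    mu = sum(ks * a for ks, a in zip(k_star, alpha))
    v = solve(K, k_star)              # K^{-1} k*
    var = kernel(xq, xq) - sum(ks * vi for ks, vi in zip(k_star, v))
    return mu, max(var, 1e-12)

def expected_improvement(mu, var, best_y):
    """EI for minimization: expected amount by which we beat best_y."""
    s = math.sqrt(var)
    z = (best_y - mu) / s
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (best_y - mu) * cdf + s * pdf

def bayes_opt(loss, n_init=4, n_iter=10, seed=0):
    """Evaluate a few random points, then repeatedly evaluate the EI maximizer."""
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n_init)]
    ys = [loss(x) for x in xs]
    grid = [i / 100 for i in range(101)]  # candidate hyperparameter values in [0, 1]
    for _ in range(n_iter):
        best_y = min(ys)
        x_next = max(grid, key=lambda xq: expected_improvement(
            *gp_posterior(xs, ys, xq), best_y))
        xs.append(x_next)
        ys.append(loss(x_next))
    i = min(range(len(ys)), key=ys.__getitem__)
    return xs[i], ys[i]

# Toy "hyperparameter": pretend validation loss is minimized at x = 0.3.
x_best, y_best = bayes_opt(lambda x: (x - 0.3) ** 2)
```

The point of the loop is that each new evaluation is chosen where the surrogate predicts either a low mean or high uncertainty, which is what lets BO get away with far fewer evaluations than grid search.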

My question is this: how do people perform Bayesian optimization for hyperparameter search in Julia? Is there a package out there that I was not able to find, or do people use PyCall, for example, to call Python's HyperOpt instead?


I found

very nice (not for ML, but for a difficult optimization problem with a stochastic objective).



Only very basic concepts of hyperparameter optimization have been implemented in Julia so far (to my knowledge). There are ongoing discussions in MLJ.jl about what could be implemented.

Anyone interested in implementing some form of BO hyperparameter optimization should have a look at BOHB: Robust and Efficient Hyperparameter Optimization at Scale, which appears to outperform standard BO. A Python package, HpBandSter, already exists.
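BOHB pairs model-based sampling with Hyperband, whose core primitive is successive halving: evaluate many configurations on a small budget, keep the best fraction, and re-evaluate the survivors on a larger budget. A rough sketch of that primitive (the `evaluate(config, budget)` callback is a hypothetical stand-in for partially training a model and returning its loss):

```python
def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Repeatedly keep the best 1/eta of configs, multiplying the budget by eta."""
    budget = min_budget
    while len(configs) > 1:
        # Rank all surviving configs by their loss at the current budget.
        ranked = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = ranked[:max(1, len(configs) // eta)]
        budget *= eta
    return configs[0]

# Toy example: the "loss" of config c is (c - 0.5)^2, regardless of budget.
best = successive_halving([i / 10 for i in range(11)],
                          lambda c, budget: (c - 0.5) ** 2)
```

The appeal is that poor configurations are discarded after only a cheap partial evaluation, so the full training budget is spent almost entirely on promising candidates.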