I’d like to introduce HMMBase.jl, a package providing basic building blocks for hidden Markov models in Julia.
While there are many implementations of HMMs for Julia and other languages, I found that:
- Julia packages such as HiddenMarkovModels.jl are no longer maintained, and thus not compatible with recent Julia versions.
- Most HMM libraries implement the algorithms directly for a given probability distribution, most commonly discrete and normal distributions, and hence cannot easily support new distributions.
Instead, I chose to rely on the Distribution interface provided by Distributions.jl to handle arbitrary distributions. As long as a distribution implements the fit! method, it is supported by HMMBase, whether it is univariate or multivariate, a single distribution or a mixture model.
For example, we can sample a two-state HMM with two different observation distributions as follows:

```julia
hmm = HMM([0.9 0.1; 0.1 0.9], [Normal(0,1), Gamma(1,1)])
z, y = rand(hmm, 1000)
```
The only constraint is that all observation distributions must have the same dimension (e.g., it is not possible to mix 2D and 3D normal distributions).
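To illustrate the multivariate case, here is a sketch assuming Distributions.jl's MvNormal; the parameters are made up for illustration:

```julia
using Distributions
using HMMBase

# Both observation distributions are 2-dimensional,
# so the same-dimension constraint is satisfied.
hmm = HMM([0.9 0.1; 0.1 0.9],
          [MvNormal([0.0, 0.0], [1.0 0.0; 0.0 1.0]),
           MvNormal([3.0, 3.0], [1.0 0.0; 0.0 1.0])])
z, y = rand(hmm, 500)  # 500 states and 500 two-dimensional observations
```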
Similarly, arbitrary containers that conform to the AbstractArray interface are supported for storing the model parameters and the states/observations. This means that, for example, ArrayFire.jl could be used to perform some computations on the GPU. That said, I have only tested standard Julia arrays and StaticArrays so far.
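As a quick sketch of what this enables, a fixed-size transition matrix from StaticArrays can stand in for a plain Matrix (the exact parameters here are made up for illustration):

```julia
using Distributions
using HMMBase
using StaticArrays

# An SMatrix implements the AbstractArray interface,
# so it can replace a standard Matrix as the transition matrix.
A = @SMatrix [0.9 0.1; 0.1 0.9]
hmm = HMM(A, [Normal(0, 1), Normal(5, 1)])
z, y = rand(hmm, 100)
```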
My goal is to provide well-tested and efficient implementations of the following algorithms:
- Baum-Welch (MLE estimator)
The MLE estimator implements only the E-step and relies on the fit! method of each distribution for the M-step. It is not yet available, as I’m working out some details, but I’ll release it soon.
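To give an idea of the M-step building block, here is a sketch using Distributions.jl's fit_mle, the weighted maximum-likelihood routine that such an M-step can defer to. The uniform weights are just for illustration; in Baum-Welch they would be the state posteriors from the E-step:

```julia
using Distributions

# Weighted maximum-likelihood fit of a Normal to observations.
# In Baum-Welch, w would hold the state posterior probabilities (γ)
# computed by the E-step; here they are uniform for illustration.
y = rand(Normal(2.0, 0.5), 10_000)
w = ones(length(y))
d = fit_mle(Normal, y, w)
mean(d)  # close to 2.0
```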
The package is available for Julia 1.1+ in my own registry:
```julia
pkg> registry add https://github.com/maxmouchet/JuliaRegistry.git
pkg> add HMMBase
```
Feel free to consult the documentation for examples.
Right now the only package that depends on HMMBase is my implementation of the Gibbs sampler for the hierarchical Dirichlet process hidden Markov model, HDPHMM.jl. Both packages will be maintained during my PhD, as some of my work depends on them. Hopefully, HMMBase will be stable enough by the time I graduate that it won’t need much work anymore.
I’m open to feedback, ideas, and contributions.