[ANN] HMMBase.jl - A lightweight and efficient Hidden Markov Model abstraction


I’d like to introduce HMMBase.jl, a package providing basic building blocks for hidden Markov models in Julia.


While there are many implementations of HMMs for Julia and other languages, I found that:

  • Julia packages such as HiddenMarkovModels.jl are no longer maintained and thus not compatible with recent Julia versions.
  • Most HMM libraries implement the algorithms directly for a given probability distribution (most commonly discrete and normal distributions) and hence cannot easily support new distributions.

Instead, I chose to rely on the interface provided by Distributions.jl to handle arbitrary distributions. As long as a distribution implements the pdf and fit! methods, it is supported by HMMBase, whether it is univariate or multivariate, a single distribution or a mixture model.

For example, we can sample from a two-state HMM with two different observation distributions as follows:

using Distributions, HMMBase

hmm = HMM([0.9 0.1; 0.1 0.9], [Normal(0,1), Gamma(1,1)])
z, y = rand(hmm, 1000)

The only constraint is that all observation distributions must have the same dimension (e.g., it is not possible to mix 2d and 3d normal distributions).
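For instance, here is a two-state HMM over 2d observations, a minimal sketch along the same lines as the example above:

```julia
using Distributions, HMMBase

# Both observation distributions share the same dimension (2).
hmm = HMM([0.9 0.1; 0.1 0.9],
          [MvNormal([0.0, 0.0], [1.0 0.0; 0.0 1.0]),
           MvNormal([5.0, 5.0], [1.0 0.0; 0.0 1.0])])
z, y = rand(hmm, 1000)
```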

Similarly, arbitrary containers that conform to the AbstractArray interface are supported for storing the model parameters and the states/observations. This means that, for example, ArrayFire.jl could be used to perform some computations on the GPU. That said, I have only tested standard Julia arrays and StaticArrays so far.


My goal is to provide well-tested and efficient implementations of the following algorithms:

  • Forward-backward
  • Viterbi
  • Baum-Welch (MLE estimator)

The MLE estimator implements only the E step and relies on the fit_mle method of each distribution for the M step.
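In other words, the per-state M step boils down to a weighted fit_mle call from Distributions.jl; roughly (a sketch only, with y standing for the observations and w for the posterior weights computed in the E step):

```julia
using Distributions

y = randn(1000)    # observations (softly) assigned to one state
w = rand(1000)     # posterior weights for that state, from the E step
# Weighted maximum-likelihood estimate for this distribution family
d = fit_mle(Normal, y, w)
```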

For now my (early) benchmark is against the hmmlearn Python library, which implements the same algorithms in Cython (for normal distributions only). This issue shows some early results.


The package is available for Julia 1.1+ in my own registry:

pkg> registry add https://github.com/maxmouchet/JuliaRegistry.git
pkg> add HMMBase

Feel free to consult the documentation for examples.

Right now the only package that depends on HMMBase is my implementation of the Gibbs sampler for the hierarchical Dirichlet process hidden Markov model, HDPHMM.jl. Both packages will be maintained during my PhD, as some of my work depends on them. Hopefully, HMMBase will be stable enough by the time I graduate that it no longer needs much work.

I’m open to feedback, ideas, and contributions :slight_smile:


This is cool, thanks for sharing! :slight_smile:

To me, it looks like HMMBase.jl implements only first-order models at the moment (i.e. the transition probabilities are conditioned only on the previous state). Is this right?

The reason that I’m asking is that in natural language processing, HMMs have been used for sequence labeling tasks like part-of-speech tagging. In this task, for example, a second-order HMM that conditions on the previous two tags generally seems to do better than a first-order model that just conditions on one previous tag.

(If this would need to be a separate subtype of AbstractHMM, perhaps this is an area where I could contribute!)

Sorry for my late reply! Indeed it only implements first-order models.

I’m not very familiar with higher-order models, but right now I see two ways of implementing them:

  • Writing specialized versions of the functions (e.g. messages_forwards_2);
  • Implementing more generic algorithms (such as belief propagation) to handle models of arbitrary orders in a single function;

The second option seems cleaner, but I’m worried about the performance implications. Alternatively, macros could be used to generate specialized functions for an arbitrary order.

As for the types, I could add a new parameter, e.g. AbstractHMM{F<:VariateForm, N}, where N represents the order (1 by default).
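Purely as a sketch (the names and layout here are hypothetical, nothing is implemented yet), that could look like:

```julia
using Distributions: VariateForm, Univariate

# Hypothetical: the order N becomes a second type parameter.
abstract type AbstractHMM{F<:VariateForm,N} end

# First-order models (the current behavior) would be the N = 1 case.
const AbstractFirstOrderHMM{F} = AbstractHMM{F,1}
```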

That said, feel free to draft something for second-order models and to open an issue/PR if you want. I’ll do some experiments also on my side.


Hi! Did you end up doing any work on second-order HMMs? I have an interest in that.