I’m happy to announce HMMGradients.jl! This package makes it possible to use gradient descent optimization to learn the parameters of Hidden Markov Models (HMMs). These optimization techniques allow the use of *deep neural networks* within HMMs, an approach widely adopted in modern speech recognition systems.

Formally, HMMGradients.jl extends ChainRulesCore.jl, making it possible to train HMMs using Julia's automatic differentiation frameworks, for example Zygote, and machine learning libraries such as Flux. Numerically stable algorithms to compute the forward, backward, and posterior probabilities of HMMs are also provided. The documentation explains the basic concepts of HMMs and introduces the procedures for training them. This software package can be used for speech recognition tasks and for other applications where HMMs are useful, such as the alignment of bio-sequences and protein folding, to name a few.
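To illustrate what "numerically stable" means here, the sketch below shows a log-space forward pass with a log-sum-exp trick, the standard way to avoid underflow when multiplying many small probabilities. This is a minimal illustrative example, not the package's API; all names (`logsumexp`, `logforward`, `loga`, `logA`, `logB`) are hypothetical.

```julia
# Numerically stable log-sum-exp: subtracts the maximum before
# exponentiating so that exp never overflows or fully underflows.
logsumexp(x) = (m = maximum(x); m + log(sum(exp.(x .- m))))

# Log-space forward algorithm (illustrative, not HMMGradients.jl's API):
#   loga[i]    : log initial probability of state i
#   logA[i, j] : log transition probability from state i to state j
#   logB[t, i] : log likelihood of the observation at time t given state i
# Returns the log-likelihood of the whole observation sequence.
function logforward(loga, logA, logB)
    T, N = size(logB)
    logα = zeros(T, N)
    logα[1, :] = loga .+ logB[1, :]
    for t in 2:T, j in 1:N
        # Sum over predecessor states, entirely in log space.
        logα[t, j] = logsumexp(logα[t-1, :] .+ logA[:, j]) + logB[t, j]
    end
    return logsumexp(logα[T, :])
end
```

Because every step is built from differentiable operations, an automatic differentiation framework such as Zygote can backpropagate through a recursion like this one to obtain gradients with respect to the HMM parameters.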