Hello everyone,
I recently wrapped up some code of mine and decided, for my own sake, to package it and register it with the General Julia registry.
The package can be found here:
The package approximates an intractable (unnormalised) posterior with a Gaussian distribution. It implements variational inference using the re-parametrisation trick. As far as I know, there is at least one other package that does the same job, AdvancedVI.jl.
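For anyone unfamiliar with the trick, here is a minimal sketch (my own illustration, not the package's API): a draw from the Gaussian approximation q(θ) = N(μ, LLᵀ) is written as a deterministic transform of standard normal noise, so the draw is differentiable with respect to the variational parameters μ and L.

```julia
using Random

# Reparametrisation trick: a sample θ ~ N(μ, L*L') is expressed as a
# deterministic function of auxiliary noise ε ~ N(0, I).
μ = [0.0, 0.0]            # mean of the Gaussian approximation
L = [1.0 0.0; 0.5 1.0]    # lower-triangular factor of the covariance
ε = randn(2)              # noise drawn independently of (μ, L)
θ = μ .+ L * ε            # gradients w.r.t. μ and L flow through this transform
```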
What is different in this package is that, instead of drawing new samples from the approximate posterior at each iteration of the optimiser in order to estimate the (intractable) evidence lower bound (ELBO), the samples are generated once at the first iteration and kept fixed throughout the algorithm.
This has the advantage that the gradient does not fluctuate between iterations, which allows the use of optimisers such as LBFGS, as opposed to stochastic gradient descent, and therefore the algorithm converges 'reliably'; a minimal sketch of the idea follows below. The disadvantage, however, is that the implemented algorithm does not scale well with an increasing number of parameters (AdvancedVI.jl does well in this respect). It is therefore recommended to use the package only on problems with a relatively low number of parameters, e.g. 2–20.
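To make the idea concrete, here is a small, self-contained sketch of fixing the Monte-Carlo samples and then optimising with LBFGS via Optim.jl. Everything below (the toy log-posterior, the parameterisation, the names) is my own illustration and not the package's actual interface.

```julia
using LinearAlgebra, Optim, Random

# Toy unnormalised log-posterior: a standard Gaussian centred at [1, -2].
logp(θ) = -0.5 * sum(abs2, θ .- [1.0, -2.0])

Random.seed!(1)
d, S = 2, 200
E = [randn(d) for _ in 1:S]        # noise samples drawn ONCE and kept fixed

# Negative ELBO as a deterministic function of x = (μ, vec(L)).
function negelbo(x)
    μ = x[1:d]
    L = tril(reshape(x[d+1:end], d, d))         # lower-triangular factor, Σ = L*L'
    mc = sum(logp(μ + L * ε) for ε in E) / S    # Monte-Carlo term on fixed samples
    H  = sum(log ∘ abs, diag(L))                # Gaussian entropy up to a constant
    return -(mc + H)
end

# Because the samples never change, negelbo is a smooth deterministic function
# of x, so a quasi-Newton optimiser such as LBFGS can be applied directly.
x0  = vcat(zeros(d), vec(Matrix(1.0I, d, d)))
res = optimize(negelbo, x0, LBFGS(); autodiff = :forward)
@show Optim.minimizer(res)[1:d]    # fitted mean, should approach [1.0, -2.0]
```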
The work was developed independently and published here (Arxiv link). Of course, the method has been widely popularised by the works "Doubly Stochastic Variational Bayes for non-Conjugate Inference" and "Auto-Encoding Variational Bayes". The method seems to have appeared earlier in "Fixed-Form Variational Posterior Approximation through Stochastic Linear Regression", and again later in "A comparison of variational approximations for fast inference in mixed logit models", and perhaps in other publications too…
Any comments are welcome.
Best, nikos