just a simple question: is there, and what is, the best neural-network package that implements (Restricted) Boltzmann Machines as a model? I know there are packages specific to RBMs, but I'm looking for something more general that allows different models (Perceptrons, Convolutional…) AND RBMs.
Thanks a lot,
RBMs are quite different from typical neural networks - while the core of NN packages is automatic differentiation, RBMs are usually trained by contrastive divergence. If you indeed want that old and slow sampling-based procedure, Boltzmann.jl should still work and provides some room for extending RBMs with layers from NN packages. From a practical standpoint, though, I'd recommend taking a look at more modern alternatives with similar properties, such as Variational Autoencoders (which have plenty of implementations in NN packages).
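To make the contrast with autodiff training concrete, here is a minimal sketch of one CD-1 step for a binary RBM in plain numpy. All names and sizes are illustrative, not from any particular package:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny RBM: 6 visible and 3 hidden units (sizes chosen arbitrarily)
n_vis, n_hid = 6, 3
W = rng.normal(0, 0.1, size=(n_vis, n_hid))
b_vis = np.zeros(n_vis)
b_hid = np.zeros(n_hid)

def cd1_update(v0, lr=0.1):
    """One CD-1 step: positive phase from data, negative phase from one Gibbs step."""
    # Positive phase: hidden probabilities conditioned on the data vector
    p_h0 = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(n_hid) < p_h0).astype(float)
    # One Gibbs step: reconstruct visibles, then recompute hidden probabilities
    p_v1 = sigmoid(h0 @ W.T + b_vis)
    v1 = (rng.random(n_vis) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_hid)
    # Gradient estimate: <v h>_data - <v h>_model (approximated by the sample)
    return lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))

v = rng.integers(0, 2, size=n_vis).astype(float)
W += cd1_update(v)
```

Note that no gradient tape appears anywhere - the update is computed from sampled statistics, which is why RBM training sits awkwardly inside autodiff-centric frameworks.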
Not quite. RBMs are very nice networks with much more physical insight than other networks, and have found plenty of use recently in Quantum Physics for instance. It’s true that they are not very fashionable in the ML world, but that’s a mistake in my honest opinion. Anyway and as I said, they are widely used in quantum systems and that’s why I ask now.
And no, I do not want to use old packages, but rather something that is integrated in a NN ecosystem.
…otherwise I can try SciKit or similar things in Python, but I was hoping for something in Julia. Maybe there is no such thing?
What kind of integration are you looking for? Can you describe the perfect package for your use case?
I don’t know exactly, but an ecosystem like Knet would be nice. I understand the problem here is exactly automatic differentiation, since you have to deal with the negative phase in the weight update rule, but I also understand that a Contrastive Divergence algorithm could be implemented. Other interesting NN ecosystems that come to mind are Theano, SciKit, PyTorch, etc., which are ready to use in that sense.
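For context, the negative phase mentioned above comes from the maximum-likelihood gradient of an RBM's weights, which splits into two expectations:

$$\frac{\partial \log p(v)}{\partial W_{ij}} = \langle v_i h_j \rangle_{\text{data}} - \langle v_i h_j \rangle_{\text{model}}$$

The positive (data) phase is cheap to evaluate, but the negative (model) phase requires sampling from the model distribution itself. Contrastive divergence approximates that expectation with a few Gibbs steps, which is why the update doesn't come out of automatic differentiation the way a feed-forward network's gradients do.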
There are indeed implementations of RBMs e.g. in PyTorch, but they don’t use anything NN-specific. For example, take a look at this implementation - it uses PyTorch tensors and PyTorch’s CUDA integration, but implements contrastive divergence and weight update completely on its own.
In Julia you don’t have to restrict yourself to any specific framework - you can easily combine CuArrays from CUDA.jl, the CD implementation from Boltzmann.jl, and a Dense layer from Knet.jl. There’s also Avalon.jl, which mostly mirrors the PyTorch API if you decide to translate some fancy Python implementation to Julia, and BoltzmannMachines.jl with a number of pre-defined RBM implementations. That should be quite enough to construct whatever RBM-based model you have in mind.