For our SumProductTransform networks (https://arxiv.org/abs/2005.01297), we have created an invertible version of the "Dense" transformation usual in neural networks. Our version features efficient inversion and efficient calculation of the determinant of the Jacobian (of course only where this operation makes sense, i.e. when the layer maps R^d -> R^d). Our approach (detailed in the paper) relies on representing and optimizing the dense layer in SVD-decomposed form, for which we needed a differentiable parametrisation of the group of unitary matrices. We have separated this functionality into its own repo, which is now registered, and you can freely use it with Flux / Zygote: https://github.com/pevnak/Unitary.jl.
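To give a feel for what "differentiable parametrisation of orthogonal matrices" means, here is a small sketch in Python/NumPy (not the Unitary.jl code, and not necessarily the construction that package uses): one classic option maps free parameters to a skew-symmetric matrix and takes its matrix exponential, which is always orthogonal and smooth in the parameters.

```python
import numpy as np
from scipy.linalg import expm

def orthogonal_from_params(theta, d):
    """Map d*(d-1)/2 free parameters to an orthogonal d x d matrix.

    Illustrative parametrisation via the matrix exponential of a
    skew-symmetric matrix; Unitary.jl may use a different construction.
    """
    A = np.zeros((d, d))
    A[np.triu_indices(d, k=1)] = theta  # fill strict upper triangle
    A = A - A.T                         # skew-symmetric: A^T = -A
    return expm(A)                      # exp(skew-symmetric) is orthogonal

d = 3
theta = np.array([0.3, -1.2, 0.7])      # d*(d-1)/2 = 3 free parameters
U = orthogonal_from_params(theta, d)
assert np.allclose(U.T @ U, np.eye(d))  # orthogonality holds exactly
```

Because the map from `theta` to `U` is smooth, it can sit inside a network and be trained by ordinary gradient descent.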

For the implementation of the Dense layer, see https://github.com/pevnak/SumProductTransform.jl/blob/master/src/layers/svddense.jl, which roughly follows the `Bijectors.jl` interface.
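The payoff of keeping the layer in SVD form `y = U diag(s) V^T x + b` is that both inversion and the log-determinant of the Jacobian become cheap. A minimal Python/NumPy sketch of that idea (for illustration only; the actual Julia implementation lives in the linked repo, and the random QR factors below stand in for a trained unitary parametrisation):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Stand-ins for learned parameters: orthogonal U, V and positive
# singular values s. Keeping s away from zero keeps the layer invertible.
U, _ = np.linalg.qr(rng.normal(size=(d, d)))
V, _ = np.linalg.qr(rng.normal(size=(d, d)))
s = rng.uniform(0.5, 2.0, size=d)
b = rng.normal(size=d)

def forward(x):
    # y = U diag(s) V^T x + b, computed with matrix-vector products only
    return U @ (s * (V.T @ x)) + b

def inverse(y):
    # x = V diag(1/s) U^T (y - b): orthogonal factors invert by transpose
    return V @ ((U.T @ (y - b)) / s)

def logabsdet():
    # |det J| = |det U| * prod(s) * |det V^T| = prod(s), so O(d) to evaluate
    return np.sum(np.log(np.abs(s)))

x = rng.normal(size=d)
y = forward(x)
assert np.allclose(inverse(y), x)  # exact inversion, no linear solve needed
```

This is the reason the SVD form is attractive for flow-like models: a plain dense layer would need an O(d^3) solve and determinant at every step.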

I would be happy if someone finds a use for this. We are currently working on a GPU version, but it will take a bit of time. Meanwhile, reach out to me with possible enhancements or questions.

Tomas