I’m pleased to announce the early alpha release of Flux, a Julia interface for machine learning.
Flux gives you the best of both worlds. Like Knet or PyTorch, Flux code is easy to reason about: it behaves like Julia, so you get control flow, good error messages and stack traces, and you can even step through models with Gallium. Unlike those frameworks, Flux can still compile models to TensorFlow or MXNet in the background, meaning you don’t have to sacrifice state-of-the-art performance.
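To give a flavour of what this looks like, here is a small feed-forward model in the style of the early Flux docs. Treat this as a sketch: layer names like `Chain` and `Affine` follow the current alpha API and may change as the interface evolves.

```julia
using Flux

# A simple multi-layer perceptron for MNIST-sized input.
# `Affine` is a fully connected layer; `relu` and `softmax`
# are plain Julia functions applied between the layers.
m = Chain(
  Affine(784, 128), relu,
  Affine(128,  64), relu,
  Affine( 64,  10), softmax)
```

Because `m` is ordinary Julia code, you can call it like a function, inspect it in the REPL, and debug it with the usual tools, while still being able to hand it off to a backend for fast execution.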
Those features, combined with intuitive mathematical syntax and first-class recurrent models to sweeten the deal, make us hopeful that Flux can become a great pedagogical tool and even the best way to explore complex new architectures.
We have some way to go, but this is a solid start, and you can check out what works so far in the docs. Over the coming weeks we’ll have blog posts with more details and examples on what Flux can do. Enjoy!