Bullet points:
- It’s incredibly easy to write research software.
- That research software runs quickly, so time isn’t lost rewriting it for production.
- Distributed computing is a piece of cake with JuliaDB (and other tools); there’s a quick sketch after this list.
- Flux is a world-class neural network package in my eyes, and the most flexible yet simple-to-use library I’ve worked with (see the Flux sketch below).
- You can do basically anything mathematical on a GPU without a headache (see the GPU sketch below).
- Also, in what other language can Neural ODEs happen so easily? DiffEqFlux.jl is a Julia library for neural differential equations (see the Neural ODE sketch below).
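On the distributed point, here’s a minimal sketch of what the “other tools” can look like using just the Distributed standard library (JuliaDB layers its distributed tables on top of this kind of machinery); the worker count and `heavy_task` function are made up for illustration:

```julia
using Distributed
addprocs(4)                                # start 4 local worker processes

# Define the work on every worker; `heavy_task` is a hypothetical stand-in
@everywhere heavy_task(n) = sum(sin, 1:n)

# Farm the calls out across the workers in parallel
results = pmap(heavy_task, 1_000_000:1_000_010)
```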
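To give a flavor of why I find Flux so simple, here is a rough sketch of a tiny classifier; the layer sizes and data are made up, and the exact training API differs a bit between Flux versions:

```julia
using Flux

# A small multilayer perceptron: 10 features in, 2 classes out
model = Chain(Dense(10, 32, relu), Dense(32, 2), softmax)

# Fake data just for illustration: 100 samples, 2 classes
X = rand(Float32, 10, 100)
Y = Flux.onehotbatch(rand(1:2, 100), 1:2)

loss(x, y) = Flux.crossentropy(model(x), y)
opt = ADAM()

# One pass over the data; in practice you’d loop over epochs
Flux.train!(loss, Flux.params(model), [(X, Y)], opt)
```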
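On the GPU point, here’s a minimal sketch of the kind of thing I mean, assuming an NVIDIA card and the CUDA.jl package (CuArrays.jl in older setups); the array sizes are arbitrary:

```julia
using CUDA

# Ordinary-looking array code, but everything lives and runs on the GPU
A = CUDA.rand(Float32, 1024, 1024)
B = CUDA.rand(Float32, 1024, 1024)

C = A * B          # matrix multiply runs on the GPU (cuBLAS under the hood)
D = sin.(C) .+ 1   # broadcasts compile down to GPU kernels automatically
total = sum(D)     # reductions work on the GPU too
```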
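And for the Neural ODE point, a rough sketch of what DiffEqFlux makes possible, with made-up dimensions and timespan (keyword details can differ between package versions):

```julia
using DiffEqFlux, OrdinaryDiffEq, Flux

# The right-hand side du/dt is itself a small neural network
dudt = Chain(Dense(2, 16, tanh), Dense(16, 2))

tspan = (0.0f0, 1.5f0)
node = NeuralODE(dudt, tspan, Tsit5(), saveat = 0.1f0)

u0 = Float32[2.0, 0.0]
pred = node(u0)    # solves the ODE whose dynamics are given by the network
```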
Personal Examples:
Although the paper I’m going to show you isn’t cutting-edge machine learning, I did do all of the work for it in Julia: https://arxiv.org/pdf/1907.11129.pdf
GitHub - caseykneale/ChemometricsTools.jl: a collection of tools for chemometrics and machine learning written in Julia. I wrote the heart and soul of this library (kind of like a mini-sklearn) in about two months of my free time with minimal effort. Look how easy it is to use:
- https://github.com/caseykneale/ChemometricsTools.jl/blob/master/shootouts/ClassificationShootout.jl
- https://github.com/caseykneale/ChemometricsTools.jl/blob/master/shootouts/RegressionShootout.jl
And that was written by a nonprofessional software developer (i.e., me).