Following some of the discussions on the micro benchmark thread, and some of the criticism I’ve read since the release of v1.0, I’ve put together a small package that makes it easy to benchmark Julia against R and Python on common high-level scientific programming tasks.
The idea is that someone can see directly what impact switching to Julia will have on performance for daily tasks, like fitting a model to some data, sampling from a posterior, etc.
The pros are:
Ease of contribution: you only have to install Python and R, and adding a benchmark boils down to writing a function for each language.
It runs on Travis.
Potentially, higher-level analyses can be quite flattering for Julia, since functionality composes nicely.
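To give a feel for how low the bar for contributing is, here is a hypothetical sketch of what a single benchmark entry might look like on the Python side: one plain function per task, timed with the standard library. The function name and task are illustrative, not taken from the package.

```python
# Hypothetical sketch of one benchmark entry on the Python side:
# a daily task (here, fitting a linear model) wrapped in a function,
# timed with timeit. Names and sizes are illustrative choices.
import timeit

import numpy as np


def fit_ols(X, y):
    """Fit ordinary least squares via a least-squares solve."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta


rng = np.random.default_rng(0)
true_beta = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
X = rng.normal(size=(1000, 5))
y = X @ true_beta + rng.normal(size=1000)

t = timeit.timeit(lambda: fit_ols(X, y), number=100)
print(f"fit_ols: {t / 100 * 1e3:.3f} ms per call")
```

The R and Julia entries would be the analogous one-function definitions, so adding a benchmark really is just writing the same small function three times.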
I don’t have much time for this right now, so I won’t add benchmarks, but if people are interested, contributions are welcome (@RoyiAvital maybe ?).
No, you need a license for Matlab; that’s one of the issues with the microbenchmarks, it’s very hard to get the setup to even run them. I would maybe include Numba, though.
I’ve thought about having a “macrobenchmark” suite to compare the efficiency of numerical libraries, e.g. LU, QR, SVD, FFT, eigenvalues, ODE integration, etc. (and perhaps moving over a couple of the microbenchmarks like matrix_multiplication). However, the likely result is that everything runs at roughly the same speed, since most languages call fast implementations written in C or Fortran under the hood.
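As a sketch of why those results would be flat: the dense linear-algebra kernels in question all dispatch to the same LAPACK routines, whichever language wraps them. A minimal timing harness in NumPy (sizes and repetition counts are arbitrary choices, and `np.linalg.solve` stands in for LU, since it uses an LU factorization internally):

```python
# Time a few dense linear-algebra kernels that all dispatch to LAPACK.
# Because every language binds the same underlying routines, timings
# like these tend to look nearly identical across languages.
import time

import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(300, 300))
b = rng.normal(size=300)

kernels = {
    "solve": lambda: np.linalg.solve(A, b),  # LU factorization inside
    "qr": lambda: np.linalg.qr(A),
    "svd": lambda: np.linalg.svd(A),
    "eig": lambda: np.linalg.eigvals(A),
}

for name, fn in kernels.items():
    t0 = time.perf_counter()
    for _ in range(5):
        fn()
    dt = (time.perf_counter() - t0) / 5
    print(f"{name:>5}: {dt * 1e3:7.2f} ms")
```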
One exception would be ODE integration, for the reasons @ChrisRackauckas has described: in Julia, evaluating the f in dx/dt = f(t, x) is fast, which changes the landscape of which ODE algorithms can be efficient.
If there are more examples like ODEs, it could be worthwhile. Otherwise, to me, it just doesn’t seem worth the effort.
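To make the f-evaluation point concrete, here is the SciPy side of that comparison: a user-supplied Python f handed to `solve_ivp`. Every internal solver step re-enters the interpreter to call f, which is exactly the per-call overhead a Julia integrator avoids by compiling f into the solver loop. The test problem is an arbitrary choice.

```python
# dx/dt = f(t, x) with a user-defined f. In CPython, each of the
# solver's many f evaluations is an interpreted function call; in
# Julia, f would be compiled into the integrator, so cheap f's make
# different ODE algorithms competitive.
import numpy as np
from scipy.integrate import solve_ivp


def f(t, x):
    # Simple linear test problem: dx/dt = -x, exact solution exp(-t).
    return -x


sol = solve_ivp(f, t_span=(0.0, 1.0), y0=[1.0], rtol=1e-8, atol=1e-10)
print(sol.y[0, -1])  # close to exp(-1)
```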
For the repository of examples of Julia’s strengths, I was thinking of it as a Julia-only thing, highlighting computations that are easy and cool in Julia but awkward, difficult, or unheard of in other languages. E.g. fast, optimized polynomial evaluation using metaprogramming; plugging a user-defined f into an ODE integrator function and having LLVM optimize away the function calls and produce pure machine code; defining a new numeric type and getting linear algebra over that type for free; etc. Precisely the kinds of things you don’t even want to try in C, Fortran, or Matlab.
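For readers unfamiliar with the polynomial example: Julia’s `@evalpoly` macro expands, at compile time, into a fixed Horner chain with no loop and no coefficient array. What that generated code boils down to can be sketched in Python by hand-unrolling it (the specific polynomial is an arbitrary example):

```python
# What compile-time polynomial expansion produces at run time: a fixed
# Horner chain, shown next to the generic runtime loop it specializes
# away. Example polynomial: p(x) = 1 + 2x + 3x^2 + 4x^3.
def horner_unrolled(x):
    # The shape of code a macro can generate once, at compile time.
    return 1 + x * (2 + x * (3 + x * 4))


def horner_loop(x, coeffs):
    # The generic loop the unrolled form replaces: one iteration and
    # one array access per coefficient, every call.
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc


print(horner_unrolled(2.0))             # 49.0
print(horner_loop(2.0, [1, 2, 3, 4]))   # 49.0
```

In Julia the unrolled version is generated for you from the coefficient list, which is the metaprogramming point being made above.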