Benchmark for latest Julia?

@ScottPJones, @John_Gibson, Do you have updated results for 0.7 Alpha?

Yes, but I don't plan to publish benchmark results until 0.7 finalizes. Last time I ran them, the geometric mean relative to C for julia-0.7-DEV was 1.08, a little slower than julia-0.6's 1.05. But maybe it's worth rerunning and posting results here now that 0.7-alpha is out.

Yep. I'm just benchmarking (string stuff) on v0.7 to see what needs to be optimized.

Can you compare to MATLAB and SciLua?

Scott, have you benchmarked portions of the C code for parse_int, or is that your assessment based on experience and reading the code? We could rename the benchmark "random_numbers". Or maybe time just the parsing of a pregenerated large array of stringified ints.

There was a whole thread about it on GitHub, 2-3 years ago.

LuaJIT is really fast and the compilation time is almost negligible. How well does it fit scientific computing?

You can use SciLua as an extension of LuaJIT for scientific computing.
The SciLua page shows it can compete with Julia on the micro-benchmarks created by the Julia project.

I'd like to see Julia beating SciLua (LuaJIT) in tests.
Julia is all about speed, and I want to see more data proving it is faster than other competing languages.

I disagree here (if your definition of faster is strictly about run-time performance of some algorithm written in Julia vs. language X).

I'd rather see more data showing how Julia is faster at the most important metric for many programmers:
How long does it take me to get the answers I need, from initial specification of the problem, to valid results?

By that metric, Julia is many times faster than C, C++, Lua, Python, and any other language I've used in 43+ years of programming, for what I've been writing the last 3 years.

[There was some ramp-up time, because back in v0.3.x, Julia lacked a number of things that I had available in other languages, such as Caché ObjectScript, but I was able to quickly add those capabilities, all by myself, writing only in Julia (and wrapping a couple of open-source libraries such as AeroSpike).]

While the metric you suggest is highly important, it is very subjective.
It might be a result of personal habit, style, etc.
I for one find MATLAB to be heaven in that regard.
Others find Python to be as elegant as a British gentleman.

The first metric, pure speed, is both important and objective.
Moreover, it has a real chance of bringing in users.
Statements like yours will strike others as subjective opinion and hence won't change the minds of those who have other preferences.

There are other metrics that can be used as a proxy, such as LOC along with total character count, to make it less subjective. [Some languages, like old-school Mumps, have very few lines of code simply because people stick a lot of code onto a single line, so low LOC is not by itself a good indicator.]
I wish there were a good way of getting metrics for released bugs per line of code for similar projects in different languages, as I feel that's a very important issue (and one where Julia might not do as well, compared to something like Rust).

SciLua seems to offer some basic numerical functionality, and the performance is great; the syntax also looks better than Python's. It has been around for a while; why hasn't it become popular?

Indeed, SciLua looks impressive.
I have no idea why it hasn't gained popularity (lack of a proper IDE?).
I wish it had, as a light alternative to Python (LuaJIT is really a miracle, and it seems its developer, Mike Pall, is off the charts in his skill at creating a JIT compiler).

Actually, its developer, @stepelu, has joined this community.
He is the best one to answer that question.

It seems SciLua (LuaJIT) sets the bar for the speed of JIT languages, and I'd like Julia to clear that bar and get faster and more efficient (actually, only Julia is even in the neighborhood of SciLua [LuaJIT], so it is already impressive!).

Remark
When I write SciLua, I mean it in the scientific-programming sense.
Of course, the heavy lifting is actually done by LuaJIT.

I was wondering if Julia can continue to improve its performance, since other programming languages are catching up. For example, Numba is really fast, and R 3.5.0 is also much faster than older versions. Some might argue that Numba or Cython are not native Python, but it is actually quite easy for a basic Python user to code in Numba or Cython, and the extra work needed is quite acceptable.

There is still nothing like StaticArrays.jl in Numba. Because of inlining and similar optimizations, it is at least 10 times faster than Numba for small arrays. Furthermore, automatic differentiation just works with Julia and can give another 10-times improvement in solving differential equations.

There's pretty much a speed limit. The issue is what kinds of code can get there. Numba and Cython architectures don't compose. I've already shown that if you stick a Numba function into odeint (a wrapper over a Fortran function), you still get something slow because of the Python in the middle and the function-call cost:

http://juliadiffeq.org/2018/04/30/Jupyter.html
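A minimal Python sketch of that point (all names here are invented for illustration, not taken from the linked post): even if the right-hand-side function itself were compiled, an integrator driven from Python still crosses the Python/compiled boundary once per step, and that per-call cost is what the solver cannot amortize.

```python
# Count how many times an integrator must call back into the RHS function.
# If `f` were a Numba-compiled function handed to odeint, every one of
# these calls would still pay Python function-call overhead.
calls = 0

def f(t, y):
    """Right-hand side dy/dt = -y; stands in for a compiled callback."""
    global calls
    calls += 1
    return -y

def euler(f, y0, t0, t1, n):
    """Plain explicit Euler driven from Python: n boundary crossings."""
    dt = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += dt * f(t, y)
        t += dt
    return y

y_end = euler(f, 1.0, 0.0, 1.0, 100_000)
print(calls)  # 100000 separate calls into f
```

With 100,000 steps there are 100,000 boundary crossings; compiling `f` in isolation does nothing about that, which is the composition problem described above.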

Julia code all composes, even with other Julia code, because you don't compile functions/packages/etc. separately but together when possible. Numba and Cython are like compiling everything as shared libraries, while Julia is like statically compiling everything together. This means that Numba and Cython can put out much better microbenchmarks than real-world solutions, as seen in the example above.

LuaJIT is a tracing JIT and compiles really late, like Julia. Julia and LuaJIT are pretty much head to head. The Julia benchmarks show Julia slightly ahead of LuaJIT, and this uses newer versions than the ones posted on the SciLua page:

These use SciLua as well, so the same benchmarks with a newer version of Julia. And in another test, you can see LuaJIT and Julia pretty much head to head in C FFI.

Funnily enough, Julia and LuaJIT beat C itself. This is because of runtime magic that makes shared-library calls compile more like statically linked ones (further reinforcing how Julia keeps an advantage over bolted-on solutions which compile things in isolation):

https://nullprogram.com/blog/2018/05/27/

LuaJIT is fine, Julia is fine. Maybe under different circumstances LuaJIT would've done well. But what Julia really has is genericness. Julia has multiple dispatch and generic functions that let everything compose. All of these benchmarks test integers and floating-point numbers, because it's not trivial to run the same benchmarks with dual numbers, numbers with uncertainty, or numbers with units. In Julia, swapping those in is trivial, since Julia doesn't treat operations on floating-point numbers as special. I think this is a bigger feature than most people realize, and I think I'll only be able to convince people of that when I show them all of the ways to use it, so I've got work to do.
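To make the dual-number point concrete, here is a hedged sketch in Python (duck typing standing in for Julia's multiple dispatch; the `Dual` class and `poly` function are invented for illustration): generic numeric code written with no knowledge of dual numbers yields a forward-mode derivative for free when handed one.

```python
# A dual number x + eps*dx with eps^2 == 0: carrying (value, derivative)
# through ordinary arithmetic gives forward-mode automatic differentiation.
class Dual:
    def __init__(self, x, dx=0.0):
        self.x, self.dx = x, dx

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.x + o.x, self.dx + o.dx)
    __radd__ = __add__

    def __mul__(self, o):
        # Product rule: d(uv) = u*dv + du*v
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.x * o.x, self.x * o.dx + self.dx * o.x)
    __rmul__ = __mul__

def poly(x):
    """Generic code: never mentions Dual, works on floats or duals."""
    return x * x + 3 * x

d = poly(Dual(2.0, 1.0))  # seed dx=1 to differentiate w.r.t. x
print(d.x, d.dx)          # value 10.0, derivative 2*2 + 3 = 7.0
```

The same `poly` runs unchanged on plain floats, which is the composition property the benchmarks above never exercise.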

But yes, there is a speed limit; in some cases other strategies can hit it, but Julia consistently gets there and is easy to optimize. Oh, and the creator of LuaJIT left. Julia is in a pretty good spot for many, many reasons beyond the surface level.

As a language implementor, I've seen how many languages end up running into a brick wall sooner or later because of fundamental issues in their design.
You can throw a ton of money and really smart people at speeding up something like JavaScript or Python, but the very design will still hold you back.
Julia actually doesn't do such a great job of optimizing at the moment (or so I've been told by some of the implementors) - it's already very fast because of its architecture.
Many of the data structures used extensively can be improved a lot (such as Dict, IMO).
Although most of the focus of v0.7/v1.0 is on getting the language and the API "right", the comparatively small amount of time spent on optimization has already made v0.7 quite a bit faster in places.

The core Julia team has said a number of times that v1.1 will focus on optimization, and I can hardly wait to see what they'll manage to pull off by next year.

Really? Looking at the git repo, Mike Pall made a commit as recently as June 5th. What happened?

He stopped around 2012. He still commits now and then, but if you check, there have been only about 100 commits in the last 4 years or so. There's a whole explanation somewhere; I think it was because he got a job.

Edit: Looks like it was more around 2016

Too bad he canā€™t be dragged into the Julia community!