@lobingera,
Yeah, I wrote about it before.
They have greatly improved their JIT engine, and I think they are improving fast.
I’d say one of the reasons is Julia itself.
I wonder how they compare in regular stuff these days.
Moreover, much of their stuff is written in MATLAB code as well (toolboxes and many other basic functions; see `expm()` and `sqrtm()`).
It’s not like everything is in C / Fortran.
On top of that, as a user, I don’t really care.
90% of the things I code are derived purely from Math (Mostly Linear Algebra).
I care how fast it runs. I won’t give Julia extra points because it is written in Julia itself.
Also, if Julia is 100x faster at recursion yet slower at array work, I’d still take MATLAB, because I don’t do General Programming, I mostly do Scientific Programming.
The nice thing about MATLAB, for me as an engineer, is that its behavior is easier to predict (For instance, I’m not sure I like the implicit broadcasting added in R2016b, I’d like it to come with a decorator) and very similar to Linear Algebra.
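Incidentally, Julia takes exactly the decorator-like route here: broadcasting is opt-in through the dot syntax, so nothing expands implicitly. A minimal sketch of how that looks:

```julia
A = [1 2; 3 4]      # 2×2 matrix
v = [10, 20]        # length-2 column vector

B = A .+ v          # the dot makes the expansion explicit: v is added to each column
s = sin.(A)         # elementwise sin, again opt-in via the dot

C = @. A^2 + 1      # the @. macro dots every operation in the expression at once
```

Without the dots, `A + v` simply throws a dimension-mismatch error, which is the predictability the implicit R2016b expansion gives up.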
It might be a matter of habit, but I took my first steps with Julia and it gave me a hard time on a few things.
For instance, in MATLAB `[ content ]` is reserved for homogeneous types.
In Julia it acts like MATLAB’s cells. I wish Julia would take MATLAB’s approach here and use `{ content }` for inhomogeneous stuff.
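To make the contrast concrete: Julia’s `[ ]` promotes to a common element type when it can, and silently falls back to `Any` (the cell-like behavior) when it cannot. A small sketch:

```julia
x = [1, 2, 3]         # homogeneous: Vector{Int}
y = [1, 2.0]          # promoted to a common type: Vector{Float64}
z = [1, "two", 3.0]   # no common promotion, so Julia falls back to Vector{Any},
                      # which behaves much like a MATLAB cell array
```

So the same bracket syntax covers both MATLAB’s `[ ]` and `{ }` cases, with the element type deciding which one you actually got.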
Moreover, why the differentiation between a 1x1 array and a scalar?
I read a single number from a CSV file and had to do 4 operations to make it usable in the `zeros` constructor.
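For what it’s worth, the 1x1-to-scalar step can be a single operation in Julia. A small sketch; the `fill` call here is just a stand-in for whatever the CSV read returned, and I’m assuming the goal was something like `zeros(n)`:

```julia
A = fill(5, 1, 1)   # stand-in for a CSV read that yields a 1×1 matrix
n = A[1]            # linear indexing extracts the scalar in one step
m = only(A)         # only() (Julia ≥ 1.4) does the same, erroring unless
                    # the container holds exactly one element
v = zeros(n)        # the scalar is now usable as a size argument
```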
Again, maybe it’s just a struggle with old habits, but it feels like the learning curve in Julia will be steeper than the one I had with MATLAB (which felt as clean as writing Linear Algebra on paper)…
The way I see it, to bring in the Signal / Image / Data processing industry, Julia must be on par on the basic stuff.
I don’t see Addition / Multiplication and the other operations on Linear Algebra objects as a “Toolbox”.
These days that is the basic stuff, and Julia must be on par there.
Then it will have the argument: “Julia does all the regular stuff as fast as [fill in your choice], and in addition we’re better at all the rest”. And hopefully the rest is good enough to make people think it’s worth learning something new.
Anyhow, I will add 2-3 real-world algorithms (K-Means, OMP, and maybe Reweighted Least Squares) to measure performance. They include loops and other logic that should make it tough for Julia, though in those fields the main bottleneck of most algorithms is working with matrices.
> All of that said… I’m still heavily surprised there’s that much of a difference in the linear algebra, and think that most of it will go away when the problems noted above are fixed (mostly: do not time in the global scope, time twice (to get rid of compilation), and maybe switch to MKL). I wouldn’t expect Julia to be faster, just very close, since again, they are likely just calling the same C/FORTRAN BLAS/LAPACK library in this case.
Each algorithm is executed 5 times and the median time is taken (It was like that from the first test).
Moreover, everything is done within a function (namely, you `include` a script which itself includes the files where the functions are defined; I still have a hard time with Julia’s scoping rules, is that OK?), so apart from Julia + MKL, everything is done as you wanted.
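For reference, the timing pattern described above boils down to: wrap the work in a function, call it once to pay the compilation cost, then time the later calls and take the median of 5 runs. A minimal sketch, where `workload` is just a placeholder for one of the benchmarked algorithms:

```julia
# Placeholder for one of the benchmarked algorithms; any function works,
# as long as the timed code is not in the global scope.
function workload(n)
    A = randn(n, n)
    return A * A'          # the kind of matrix work that dominates these benchmarks
end

workload(50)                                  # warm-up call: triggers compilation
times = [@elapsed workload(50) for _ in 1:5]  # 5 measured runs, as in the benchmark
med = sort(times)[3]                          # median of the 5 timings
```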