In as few lines as possible describe why you love julia

Same here. Right now, I could be happy because 50 lines of Julia code do what would have taken me 200 lines of code before I learned Julia. But since making the code run on the GPU is so easy in Julia (and gives me a 10x speedup) I want to do that. And then I end up being frustrated about AD routines that allocate more memory than my GPU has, and thinking I need to write my own GPU kernels for some operations. This wouldn’t even have been an option before! My code would run on the CPU, and if it took a week there’d be nothing I could do about it, and I’d be happy!

I think the trick might be to not have higher ambitions just because the language can do more.

11 Likes

The trick is to keep refining the foundations. If the foundational aspects are iterated on and well designed, then the higher-level code will work better. If one starts programming at too high an abstraction layer without redesigning the low-level foundations properly, then one runs into the issues you describe. To mitigate that, one must not lose sight of redesigning the foundational building blocks, which is easy to do with so many high-level abstractions available. The more solid and refined the mathematical foundations one builds from, the better and more interoperable the higher-level building blocks become. Julia is fantastic precisely because it makes it possible to keep refining the low-level foundations together with the higher-level abstractions.

11 Likes

For me it has a lot to do with the trade-off between nice abstract user interfaces and fast code. I feel that because Julia specializes on generically written functions, even “low-level” functions deeper inside packages have kind of nice APIs and can be understood without much effort. In other languages, the further down you go, the worse it gets (especially behind the Java / C++ / C barrier for Matlab or Python).

Having fast custom structs encourages nicer APIs as well, because a Point in plotting is allowed to be a point, and a collection of points doesn’t have to be a 2D Array just because that’s how matplotlib expects it.
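
As a quick sketch of that idea (this `Point` type and the accessor functions are made up for illustration, not any plotting package’s actual API):

```julia
# A minimal immutable 2D point type; small structs like this carry no
# runtime overhead compared to working with raw numbers.
struct Point{T<:Real}
    x::T
    y::T
end

# A "plotting" API can accept any collection of Points rather than
# demanding a 2xN matrix layout.
xcoords(ps) = [p.x for p in ps]
ycoords(ps) = [p.y for p in ps]

pts = [Point(1, 2), Point(3, 4), Point(5, 6)]
xcoords(pts)  # [1, 3, 5]
```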

And I love broadcasting because it’s so much clearer than “hidden” vectorization.
So, for a short example… Maybe something like this:

a_range = 1:20
a_vector = rand(20)
a_generator = (1//x for x in 1:20)
result = a_range .* a_vector .+ a_generator

Three different iterable objects, three different data types inside them, one short broadcasting expression, and it will be pretty fast because of specialization through multiple dispatch. So easy to write and understand, too!

9 Likes

Personally, I had little experience in serious programming prior to Julia. I had used R/MATLAB/Python heavily, but I never really felt I had a deeper understanding of what’s under the hood, and thought all this was too hard for the uninitiated. Nine months into Julia, I can talk about and answer questions on parametric types, multiple dispatch, in-place operations, and how to debug inefficiently written code (and even have a little intuitive understanding of what native code might look like for something I am coding). I think the ability for a newbie to pick up Julia and become decent at it in a short span of time is an underrated aspect of the language. The other great aspect is that there are only a few tricks you need to master to go from beginner to good at Julia: avoid non-constant globals, favor in-place operations, write for-loops, and try your best to minimize type instabilities. Just these few things will speed up your code relative to most languages, and you’ll begin to appreciate how good a language Julia really is!
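
A minimal sketch of two of those tricks (the names `SCALE` and `scale!` are made up for the example):

```julia
# A non-constant global forces the compiler to assume its type can change
# at any moment; marking it `const` restores type stability.
const SCALE = 2.0

# In-place version: writes into a preallocated output array instead of
# allocating a fresh one on every call.
function scale!(out, v)
    @inbounds for i in eachindex(out, v)
        out[i] = v[i] * SCALE
    end
    return out
end

v = rand(100)
out = similar(v)
scale!(out, v)  # reuses `out`, no allocation inside the loop
```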

52 Likes

Testing code that runs at C++ speed in a REPL.

Nuff said.

6 Likes

I am a little more demanding than you, XD.

I do want to be as far from the metal as possible, except when I need to be as close to the metal as possible, and I want the same language to do both things seamlessly. Julia gets close to achieving that.

9 Likes

Being “close to the metal” is mostly an illusion on modern x86 architectures. You almost never are (in the sense of having a mapping from code, even assembly, to what is going on in the hardware), and it is rarely a useful mental model any more.

3 Likes

I :heart: the Julia community. I’m someone who just kind of found myself (quite unexpectedly) working in a data analytics capacity almost a full decade after graduating with a BS in Economics and several years after finishing an MS in Finance/Economics. My previous experience coding was limited to some moonlight web development work that I had been doing on the side for several years. Making the leap from JavaScript/Node.js to Julia was very intimidating at first, given that this community is made up of quite a few PhDs and scientific types. There were times in the beginning (about a year and 3 months ago) where I was uncertain if Julia was right for me and I was questioning whether or not I should stop trying to learn it and switch to Python where it seemed that I might fit into the community a bit better. After signing up for discourse and posting lots of questions/getting involved in the community here all of those anxieties just kind of evaporated. I’ve been truly amazed at how much time some people are willing to take to help a complete stranger and I’ve grown immensely in the past year or so as a data analyst as a result. Not only that, the help I’ve received here has literally allowed my small organization’s fledgling data analytics capacity to increase in a very tangible way.

Thanks to all of you - there are several who have posted here that have taken time to respond to questions that I’ve posted and I’m so grateful for it, every single day :slightly_smiling_face: :handshake:

39 Likes

Tbh, I have a strong tendency to being nerd-sniped :smiley: and I guess there are a few more people here who tend that way as well.
But I’m sure that from the outside I look like a nice, welcoming member of this community :grin:. In fact, I even think that tendency contributes to a nice community, because it leads to a large portion of shared interest in finding the cleanest solution for a given problem. Nevertheless, most members of this community are plainly nice and helpful in their own right. I like the community too, and I really love the cleanliness of the code one can achieve, even for complex stuff.

3 Likes

I also wanted to add a new point regarding cleanliness and design of Julia:
Many members probably remember the introduction of the current call operator definition syntax:

function (obj::SomeType)(args...)
    [...]
end

At first sight it might seem like some special case of function definition. But looking closer reveals that the standard function definition is really just convenient syntactic sugar* for function-typed objects:

function fname(args...)
    [...]
end

# corresponds to

function fname end  # defines the function object `fname`
function (fun::typeof(fname))(args...)
    [...]
end

*Implementation might differ :stuck_out_tongue:
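
For a concrete (made-up) illustration of the call-operator syntax, here is a callable struct that evaluates a polynomial with Horner’s rule:

```julia
# A polynomial represented by its coefficients, lowest degree first.
struct Polynomial
    coeffs::Vector{Float64}
end

# Make Polynomial instances callable: p(x) evaluates via Horner's rule.
function (p::Polynomial)(x)
    result = 0.0
    for c in reverse(p.coeffs)
        result = result * x + c
    end
    return result
end

p = Polynomial([1.0, 2.0, 3.0])  # 1 + 2x + 3x^2
p(2.0)                           # 1 + 4 + 12 = 17.0
```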

Constructors also fit into that universal syntax, since they can be written as

struct MyType end
(::Type{MyType})(args...) = ...

Thanks to this focus on overall design the language mostly feels inherently consistent and capable.

Another “feature” with a similar level of unification is the idea of invariant tuples. Once it hopefully arrives, it will again turn a special case in the language design (only tuples are covariant) into syntactic sugar, while maintaining and even increasing the expressiveness of the type system!

4 Likes

I love that Rational

  1. is available in Base (so handy when you need it for exact calculations — for me, mainly plotting),

  2. but not built into the language: rather, it is written in Julia

  3. in a way that is so exemplary that it works fine as a demo of a lot of features (parametric types, inner constructors, validation, promotion) without major modifications.

It is but a tiny thing, but serves so many orthogonal purposes.
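
A quick illustration of why it is so handy for exact calculations:

```julia
# Rational arithmetic stays exact where floating point rounds:
1//10 + 2//10 == 3//10   # true
0.1 + 0.2 == 0.3         # false, floating-point rounding

# And it promotes cleanly when mixed with other number types:
1//2 + 1                 # 3//2
1//2 + 0.5               # 1.0 (promoted to Float64)
```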

19 Likes

Julia seems to be able to avoid “surprises” (you know, the ones that leave you fuming after a debugging session: “What were they thinking?” Looking at you, Matlab, C++, …).

7 Likes

Event-based simulation of chaotic spiking neural network of quadratic integrate-and-fire neurons including Poincaré section for visualisation:

using PyPlot, StaticArrays

function poincare()
    n, ϕ, 𝚽 = SMatrix{3,3}(0 .< [0 0 0; 1 0 1; 0 1 0]), randn(3), []  # adjacency matrix, initial phases, section storage
    for s = 1:8^7                                            # number of spikes in calculation
        m, j = findmax(ϕ)                                    # find next spiking neuron j
        ϕ .+= π/2 - m                                        # evolve phases till next spike time
        ϕ[n[:,j]], ϕ[j] = atan.(tan.(ϕ[n[:,j]]) .- 1), -π/2  # update postsynaptic neurons, reset spiking neuron
        j == 1 && append!(𝚽, ϕ[2:3])                         # save neurons 2 & 3 whenever neuron 1 spikes
    end
    plot(𝚽[1:2:end], 𝚽[2:2:end], ".k", ms=.1); axis("off")
end


21 Likes

Would you say the same about Python? Why or why not? Just curious.

Collaboration without consternation.

5 Likes

Frankly, I loved Julia for its high speed while still being a high-level language. In the early versions, when I started using the language (v0.5), it was amazing. But now I see the performance of the language somehow fading. It makes me sad.

1 Like

Do you have the impression that the language is getting slower? That’s pretty surprising.

5 Likes

It would be great to see some specific code that demonstrates this, ideally opening an issue.

Please note that performance is tested extensively, and reports of regressions are taken seriously.

13 Likes

I would be interested in hearing about evidence to that effect. We take regression reports very seriously, but if you don’t report regressions, then we can’t investigate them. The general trend seems quite the opposite: Julia has gotten significantly faster since 0.5. Back then it was typical for well-tuned Julia code to be somewhere between as fast as C and half as fast as C. These days, well-tuned Julia code is often faster than C, sometimes by quite a bit. It’s very common that old code has just automatically gotten faster from a combination of LLVM improvements and Julia compiler improvements.

23 Likes

My guess is that, on average, Julia code being run in the wild is slower, but only because more people are running poorly tuned code. Mea culpa :sweat:

2 Likes