In as few lines as possible describe why you love julia

In as few lines of coherent code as possible show why you love Julia :slight_smile:

I’ll kick this off with a silly example:

message = Dict( [1,2,3,4] .=> ["why", "is", "julia", "great?"])
result = join(map(x -> message[x], [ 3, 2, 4 ] ), " ")

This is why I love Julia. I just write down the stuff that occurs in my head and it just works!



hypo["test negative AND notpreg"] = Pr["negative|notpreg"] * prior["notpreg"]
hypo["test positive AND notpreg"] = Pr["positive|notpreg"] * prior["notpreg"]
hypo["test negative AND preg"] = Pr["negative|preg"] * prior["preg"]
hypo["test positive AND preg"] = Pr["positive|preg"] * prior["preg"]

Cost["Do nothing|notpreg"] = 0.0
Cost["Do abortion|notpreg"] = 1000.0
Cost["Do nothing|preg"] = 12000.0 * 18.0
Cost["Do abortion|preg"] = 2000.0

Action["Do nothing|Test Negative"]= Cost["Do nothing|notpreg"] * hypo["test negative AND notpreg"] + Cost["Do nothing|preg"] * hypo["test negative AND preg"]
Action["Do abortion|Test Negative"]= Cost["Do abortion|notpreg"] * hypo["test negative AND notpreg"] + Cost["Do abortion|preg"] * hypo["test negative AND preg"]
Action["Do nothing|Test Positive"]= Cost["Do nothing|notpreg"] * hypo["test positive AND notpreg"] + Cost["Do nothing|preg"] * hypo["test positive AND preg"]
Action["Do abortion|Test Positive"]= Cost["Do abortion|notpreg"] * hypo["test positive AND notpreg"] + Cost["Do abortion|preg"] * hypo["test positive AND preg"]

for kv in Action
    println(kv[1], "    ", kv[2])
end
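The snippet above assumes `prior`, `Pr`, and the result Dicts already exist. Here is a self-contained sketch with made-up numbers — the 10% prior and the test accuracies are illustrative assumptions, not values from the original post:

```julia
# All probabilities below are illustrative assumptions.
prior = Dict("preg" => 0.1, "notpreg" => 0.9)
Pr    = Dict("positive|preg" => 0.99, "negative|preg" => 0.01,
             "positive|notpreg" => 0.05, "negative|notpreg" => 0.95)

# Joint probability of each (test result, pregnancy state) pair
hypo = Dict{String,Float64}()
hypo["test negative AND notpreg"] = Pr["negative|notpreg"] * prior["notpreg"]
hypo["test positive AND notpreg"] = Pr["positive|notpreg"] * prior["notpreg"]
hypo["test negative AND preg"]    = Pr["negative|preg"] * prior["preg"]
hypo["test positive AND preg"]    = Pr["positive|preg"] * prior["preg"]

Cost = Dict("Do nothing|notpreg"  => 0.0,
            "Do abortion|notpreg" => 1000.0,
            "Do nothing|preg"     => 12000.0 * 18.0,
            "Do abortion|preg"    => 2000.0)

# Expected cost of each action given each test result
Action = Dict{String,Float64}()
Action["Do nothing|Test Negative"]  = Cost["Do nothing|notpreg"]  * hypo["test negative AND notpreg"] + Cost["Do nothing|preg"]  * hypo["test negative AND preg"]
Action["Do abortion|Test Negative"] = Cost["Do abortion|notpreg"] * hypo["test negative AND notpreg"] + Cost["Do abortion|preg"] * hypo["test negative AND preg"]
Action["Do nothing|Test Positive"]  = Cost["Do nothing|notpreg"]  * hypo["test positive AND notpreg"] + Cost["Do nothing|preg"]  * hypo["test positive AND preg"]
Action["Do abortion|Test Positive"] = Cost["Do abortion|notpreg"] * hypo["test positive AND notpreg"] + Cost["Do abortion|preg"] * hypo["test positive AND preg"]

for (k, v) in Action
    println(k, "    ", v)
end
```

With these numbers the cheaper expected action is "do nothing" after a negative test and the opposite after a positive one, which is the whole point of the exercise.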

Flexible as Python, easy as Matlab, fast as Fortran, deep as Lisp.

EDIT: now I see you asked for code…


After reading your succinct solution I am inclined to believe you did write Julia code :). No code necessary!


Someone opened an issue on one of the DiffEq packages to discuss a few things, and then showed me how they were using the boundary value problem solvers (which wrap pure Julia ODE solvers under the hood) to perform parameter fitting on a model defined with quaternions (an aerospace application) using automatic differentiation. Did it work? Yeah. I didn’t even realize we were that cool, but Julia always surprises you by enabling cool new code… and you get to claim you intended that to happen the whole time. :slight_smile:


When I wrote PDFP (basically decimal floating point instead of binary floating point) in Mathematica, it looked like this when I used it:

D = PDFP_add[PDFP[1],PDFP_mul[PDFP[2],PDFP[3]]]

I converted it to Julia, and this is what it looks like now:

using PDFPs

D = PDFP(1) + PDFP(2) * PDFP(3)

@StevenSiew That’s how I always felt about it too. It’s more like math or something: when you know the rules, things just work.

@ChrisRackauckas - I think that’s also how I feel about it. Not that I’ve had an abstract ah-hah like that! But in most other languages I find a lot of my time is spent changing code, or adding more abstractions to do something that seems like it should already work. In Julia I’m usually like “oh, I could do it this way, or that way, or I guess even that way”. It’s so flexible; you don’t have to force things into place, they just float into place.

You can literally feel when you hit wrapper code for Python/C/whatever by how much time is spent in docs and looking at git repos.


And matrix operations work too!

Without me even telling Julia how to do linear algebra with PDFP.
I just tell Julia that PDFP is a subtype of Real.

julia> using PDFPs
[ Info: Precompiling PDFPs [top-level]

julia> PDFP_setShowStyle(:tiny)

julia> inv([ PDFP(1) PDFP(2); PDFP(3) PDFP(4) ])
2×2 Array{PDFP,2}:
 -2.000   1.000 
  1.500  -0.5000
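That genericity is easy to try at home. Here is a toy stand-in for PDFP (a hypothetical `Dec` type wrapping a `Rational` — not the real PDFPs package) showing that subtyping `Real` plus a handful of scalar arithmetic methods is all that generic matrix multiply and `inv` need:

```julia
# Hypothetical stand-in for PDFP: a custom number type. Subtype Real,
# define basic scalar arithmetic, and generic linear algebra works
# without writing any matrix code at all.
struct Dec <: Real
    v::Rational{Int}
end
Dec(x::Dec) = x                       # identity constructor (for oneunit etc.)
Base.:+(a::Dec, b::Dec) = Dec(a.v + b.v)
Base.:-(a::Dec, b::Dec) = Dec(a.v - b.v)
Base.:-(a::Dec)         = Dec(-a.v)
Base.:*(a::Dec, b::Dec) = Dec(a.v * b.v)
Base.:/(a::Dec, b::Dec) = Dec(a.v / b.v)
Base.inv(a::Dec)        = Dec(inv(a.v))
Base.abs(a::Dec)        = Dec(abs(a.v))
Base.isless(a::Dec, b::Dec) = isless(a.v, b.v)
Base.:(==)(a::Dec, b::Dec)  = a.v == b.v

using LinearAlgebra

M = [Dec(1) Dec(2); Dec(3) Dec(4)]
M * M       # generic matrix multiply, for free
inv(M)      # generic LU factorization, also for free
```

Because `Dec` is exact rational arithmetic underneath, the inverse comes out exactly as -2, 1, 3/2, -1/2 — same shape of result as the PDFP example above.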

This just came to my mind:

using Dates

# Find the next Friday the 13th that falls in April
tonext(today()) do d
    dayofweek(d) == Fri &&
    day(d) == 13 &&
    month(d) == Apr
end
# 2029-04-13

Python’s datetime is horrible to use…
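Since `tonext` starts from whatever date you hand it, the 2029 answer depends on when you run it. Anchoring the search at a fixed date (here 2018-04-14, the day after the previous April Friday the 13th) makes it reproducible:

```julia
using Dates

# Anchor the search at a fixed date so the answer is deterministic.
d = tonext(Date(2018, 4, 14)) do d
    dayofweek(d) == Fri && day(d) == 13 && month(d) == Apr
end
# d == Date(2029, 4, 13)
```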


@Sijun I’m sure it was an honest mistake, but I believe your first sentence violates the community guidelines. Kindly edit your post.


Oh I see what you mean.


I code


Version 1: Just need that running

struct ReverseDigitsIterator
    num::Int
end
Base.eltype(x::ReverseDigitsIterator) = Int
Base.iterate(x::ReverseDigitsIterator) = reverse(divrem(x.num, 10))
function Base.iterate(x::ReverseDigitsIterator, state::Int)
    state == 0 && return nothing
    return reverse(divrem(state, 10))
end
Base.IteratorSize(::Type{<:ReverseDigitsIterator}) = Base.SizeUnknown()
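A quick sanity check of Version 1 (repeating the definition so the snippet stands alone): the first `iterate` call peels off the lowest digit via `divrem` and carries the quotient as state, so collecting yields the digits in reverse.

```julia
# Version 1 again, plus a check that it really yields digits in reverse.
struct ReverseDigitsIterator
    num::Int
end
Base.eltype(x::ReverseDigitsIterator) = Int
Base.iterate(x::ReverseDigitsIterator) = reverse(divrem(x.num, 10))
function Base.iterate(x::ReverseDigitsIterator, state::Int)
    state == 0 && return nothing
    return reverse(divrem(state, 10))
end
Base.IteratorSize(::Type{<:ReverseDigitsIterator}) = Base.SizeUnknown()

collect(ReverseDigitsIterator(1234))  # [4, 3, 2, 1]
```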

Version 2: More types (e.g. BigInt) & length prediction

struct ReverseDigitsIterator{T<:Integer}
    num::T
end
Base.eltype(x::ReverseDigitsIterator{T}) where T = T
Base.iterate(x::ReverseDigitsIterator) = reverse(divrem(x.num, 10))
function Base.iterate(x::ReverseDigitsIterator{T}, state::T) where T
    state == 0 && return nothing
    return reverse(divrem(state, 10))
end
Base.length(x::ReverseDigitsIterator) = ndigits(x.num)

Version 3: Screw it, I want arbitrary Int bases as well!

struct ReverseDigitsIterator{T<:Integer, B}
    num::T
    ReverseDigitsIterator(val::T, base::Int=10) where T = new{T, base}(val)
end
Base.eltype(x::ReverseDigitsIterator{T}) where T = T
Base.iterate(x::ReverseDigitsIterator{T,B}) where {T,B} = reverse(divrem(x.num, B))
function Base.iterate(x::ReverseDigitsIterator{T,B}, state::T) where {T,B}
    state == 0 && return nothing
    return reverse(divrem(state, B))
end
Base.length(x::ReverseDigitsIterator{T,B}) where {T,B} = ndigits(x.num, base=B)

@rapus95, it took me a while to figure out what you were trying to say.

You are saying that it is very easy to go from a function for a very specific type to a function for a generic, arbitrary type.


Originally I only wanted to post Version 1 to show the elegant layout of the iteration interface, as well as its brevity and efficiency:

julia> reverse_digits_fast(x) = reduce((old,new)->old*10+new,ReverseDigitsIterator(x))
julia> @btime reverse_digits_fast(-23784395623)
  20.942 ns (0 allocations: 0 bytes)

but as the timing was fluctuating over a range of a whole nanosecond (!!! :stuck_out_tongue:) I decided to implement length and also arbitrary types so I could test BigInt. -> Version 2
It turned out the speed was still fluctuating, but at least I could benchmark BigInt now:

julia> @btime reverse_digits_fast(big"-2378345987256434395623")
  15.899 μs (458 allocations: 7.84 KiB)

and just out of simple opportunity I added the 3rd version.
Sadly, the benchmarks got a bit worse :confused:

reverse_digits_fast(x, base=10) = reduce((old,new)->old*base+new,ReverseDigitsIterator(x, base))
julia> @btime reverse_digits_fast(-23784395623)
  21.664 ns (0 allocations: 0 bytes)

julia> @btime reverse_digits_fast(big"-2378345987256434395623")
  16.599 μs (460 allocations: 7.87 KiB)

In the end I found the ease of incremental coding (eventually even prototyping) worth showcasing as well. Also note that there is almost no change in the amount of code needed for those additional features.


Because you can write code that looks like the pseudocode in the paper and still have it turn out 20% faster than Matlab’s internal C code. Even when it’s complicated:

This example involves nested loops of arbitrary dimensionality (see all the ... in the paper figure).

Maurer, Qi, and Raghavan (2003). A Linear Time Algorithm for Computing Exact Euclidean Distance Transforms of Binary Images in Arbitrary Dimensions. IEEE Transactions on Pattern Analysis and Machine Intelligence 25: 265–270. DOI: 10.1109/TPAMI.2003.1177156
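The `...` trick in the paper's figure maps onto Julia's `CartesianIndices`: one method covers every dimensionality. To give a flavor of that idiom, here is a hypothetical brute-force distance transform (quadratic time, NOT the paper's linear-time algorithm):

```julia
# Brute-force Euclidean distance transform, any dimensionality.
# (O(n^2); just to show the arbitrary-N iteration idiom.)
function brute_edt(B::AbstractArray{Bool,N}) where N
    fg = findall(B)                      # foreground voxel coordinates
    D = similar(B, Float64)
    for I in CartesianIndices(B)         # one loop, works for any N
        D[I] = isempty(fg) ? Inf :
               sqrt(minimum(sum(abs2, Tuple(I - J)) for J in fg))
    end
    return D
end

B = falses(3, 3); B[2, 2] = true
brute_edt(B)                             # D[1, 1] == sqrt(2)
```

The same definition works unchanged on a 3-D or 5-D array, which is exactly the "nested loops of arbitrary dimensionality" the paper needs.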


You know, a lot of languages talk about being “closer to the metal”, and that’s a real thing. But being “closer to the math” is incredibly important for us scientific computing people.

The beautiful thing about Julia is that it’s close to the metal and it’s close to the math!


I never understood the fascination of being “close to the metal”.

I want to be as far away from the metal as possible. The metal has billions of transistors (literally), complex instruction sets (that vary by machine), and various arcane structures like layers of caches. I occasionally need to think about these things, but being close to all the details would be a distraction.

I want to write generic code. It’s the job of the language and the compiler to figure out how it works on the metal. The less I need to know about the low-level details the happier I will be.

Julia helps me with this.
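A tiny illustration of that division of labor (`mysum` is a hypothetical name, just for flavor): write one generic definition, and the compiler emits specialized machine code per element type.

```julia
# One generic definition; the compiler specializes it for each type.
mysum(xs) = foldl(+, xs; init = zero(eltype(xs)))

mysum(1:10)             # specialized Int code
mysum(Float32.(1:10))   # specialized Float32 code, same source
# Inspect the generated assembly if curious:
# @code_native mysum([1.0, 2.0, 3.0])
```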


By using Julia, I hardly ever have any frustrations with programming. The versatility of the type system and the multiple dispatch is so advanced that I can always transform a mathematical idea into implementation. After iterating through evolving design choices, the result is a naturally fitting algebra and computational language with efficient and adaptive representation.


I seriously envy you :wink: I think that Julia reduced my frustration with programming by a factor of 10, but there is still plenty of it left.

I just finished profiling, benchmarking, and micro-optimizing a medium-sized codebase so that it can run some estimation of an economic model over the weekend. I got a 10x speedup in the end, most of it from tracking down a noxious type instability that was only activated for AD, but I was seriously considering buying a farm (without internet) and keeping horses.