Someone opened an issue on one of the DiffEq packages to discuss a few things, and then showed me how they were using the boundary value problem solvers (which wrap pure Julia ODE solvers under the hood) to perform parameter fitting on a model defined with quaternions (an aerospace application) using automatic differentiation. Did it work? Yeah. I didn’t even realize we were that cool, but Julia always surprises you by generating cool new code… and you get to claim you intended that to happen the whole time.

@StevenSiew That’s how I always felt about it too. It’s more like math or something: when you know the rules, things just work.

@ChrisRackauckas - I think that’s also how I feel about it. Not that I’ve had an abstract ah-hah like that! But in most other languages I find a lot of my time is spent changing code, or adding more abstractions to do something that seems like what it should be doing already. In Julia I’m usually like “oh, I could do it this way or that way or, I guess, even that way”. It’s so flexible; you don’t have to force things to do things, they are just floating there waiting to be put into place.

You can literally feel when you hit wrapper code for Python/C/whatever by how much time is spent in the docs and looking at git repos.

using Dates
# Find the next Friday the 13th that falls in April
tonext(today()) do d
    dayofweek(d) == Fri &&
        day(d) == 13 &&
        month(d) == Apr
end
# 2029-04-13

struct ReverseDigitsIterator
    num::Int
end
Base.eltype(x::ReverseDigitsIterator) = Int
Base.iterate(x::ReverseDigitsIterator) = reverse(divrem(x.num, 10))
function Base.iterate(x::ReverseDigitsIterator, state::Int)
    state == 0 && return nothing
    return reverse(divrem(state, 10))
end
Base.IteratorSize(::Type{<:ReverseDigitsIterator}) = Base.SizeUnknown()
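A quick usage check of this first version (the definitions are repeated here so the snippet runs standalone):

```julia
struct ReverseDigitsIterator
    num::Int
end
Base.eltype(x::ReverseDigitsIterator) = Int
Base.iterate(x::ReverseDigitsIterator) = reverse(divrem(x.num, 10))
function Base.iterate(x::ReverseDigitsIterator, state::Int)
    state == 0 && return nothing
    return reverse(divrem(state, 10))
end
Base.IteratorSize(::Type{<:ReverseDigitsIterator}) = Base.SizeUnknown()

# Digits come out least-significant first:
collect(ReverseDigitsIterator(1234))  # [4, 3, 2, 1]
```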

Version 2: More types (e.g. BigInt) & length prediction

struct ReverseDigitsIterator{T<:Integer}
    num::T
end
Base.eltype(x::ReverseDigitsIterator{T}) where T = T
Base.iterate(x::ReverseDigitsIterator) = reverse(divrem(x.num, 10))
function Base.iterate(x::ReverseDigitsIterator{T}, state::T) where T
    state == 0 && return nothing
    return reverse(divrem(state, 10))
end
Base.length(x::ReverseDigitsIterator) = ndigits(x.num)

Version 3: Screw it, I want arbitrary integer bases as well!

struct ReverseDigitsIterator{T<:Integer, B}
    num::T
    ReverseDigitsIterator(val::T, base::Int=10) where T = new{T, base}(val)
end
Base.eltype(x::ReverseDigitsIterator{T}) where T = T
Base.iterate(x::ReverseDigitsIterator{T,B}) where {T,B} = reverse(divrem(x.num, B))
function Base.iterate(x::ReverseDigitsIterator{T,B}, state::T) where {T,B}
    state == 0 && return nothing
    return reverse(divrem(state, B))
end
Base.length(x::ReverseDigitsIterator{T,B}) where {T,B} = ndigits(x.num, base=B)
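A quick check of this final version (definitions repeated so the snippet runs standalone), showing both the arbitrary base and BigInt support:

```julia
struct ReverseDigitsIterator{T<:Integer, B}
    num::T
    ReverseDigitsIterator(val::T, base::Int=10) where T = new{T, base}(val)
end
Base.eltype(x::ReverseDigitsIterator{T}) where T = T
Base.iterate(x::ReverseDigitsIterator{T,B}) where {T,B} = reverse(divrem(x.num, B))
function Base.iterate(x::ReverseDigitsIterator{T,B}, state::T) where {T,B}
    state == 0 && return nothing
    return reverse(divrem(state, B))
end
Base.length(x::ReverseDigitsIterator{T,B}) where {T,B} = ndigits(x.num, base=B)

collect(ReverseDigitsIterator(6, 2))       # binary digits of 6: [0, 1, 1]
collect(ReverseDigitsIterator(big(10)^3))  # BigInt works too: [0, 0, 0, 1]
```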

but as those timings were fluctuating within a range of a whole single nanosecond (!!!), I decided to implement length and also arbitrary types so I could test BigInt. -> Version 2
It turned out speed was still fluctuating, but at least I could benchmark BigInt now.

In the end I found the ease of incremental coding (even prototyping) worth showcasing as well. Also note that there is almost no change in the amount of code needed for these additional features.

Because you can write code that looks like the pseudocode in the paper and still have it turn out 20% faster than MATLAB’s internal C code. Even when it’s complicated:

This example involves nested loops of arbitrary dimensionality (see all the ... in the paper figure).
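To illustrate how Julia handles those "..." dimensions generically: this is not the paper's linear-time algorithm, just a hypothetical brute-force sketch of a Euclidean distance transform, showing how CartesianIndices lets one loop work over arrays of any dimensionality:

```julia
# Brute-force Euclidean distance transform over arrays of ANY dimension.
# NOT Maurer et al.'s linear-time algorithm -- just a sketch showing that
# CartesianIndices makes the nested "..." loops dimension-generic.
function edt_bruteforce(mask::AbstractArray{Bool})
    inds = CartesianIndices(mask)
    fg = [I for I in inds if mask[I]]      # foreground voxel indices
    out = similar(mask, Float64)
    for I in inds
        # distance from I to the nearest foreground voxel
        out[I] = isempty(fg) ? Inf :
                 sqrt(minimum(sum(abs2, Tuple(I - J)) for J in fg))
    end
    return out
end

edt_bruteforce([false, true, false, false])  # [1.0, 0.0, 1.0, 2.0]
```

The same function works unchanged on 1-D vectors, 2-D images, and 3-D volumes; the compiler specializes the loop for each array type.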

Citation:
Maurer, Qi, and Raghavan (2003). A Linear Time Algorithm for Computing Exact Euclidean Distance Transforms of Binary Images in Arbitrary Dimensions. IEEE Transactions on Pattern Analysis and Machine Intelligence 25: 265-270. DOI: 10.1109/TPAMI.2003.1177156

You know, a lot of languages talk about being “closer to the metal” - and that’s a real thing. But being “closer to the math” is incredibly important for us scientific computing people.

The beautiful thing about Julia is that it’s close to the metal and close to the math!

I never understood the fascination of being “close to the metal”.

I want to be as far away from the metal as possible. The metal has billions of transistors (literally), complex instruction sets (that vary by machine), and various arcane structures like layers of caches. I occasionally need to think about these things, but being close to all the details would be a distraction.

I want to write generic code. It’s the job of the language and the compiler to figure out how it works on the metal. The less I need to know about the low-level details the happier I will be.
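A tiny illustration of that point (a hypothetical function, not from the original post): you write one generic definition, and the compiler emits specialized native code for each concrete type it is called with.

```julia
# One generic definition; Julia compiles a specialized native method
# for each concrete number type this gets called with.
mypoly(x) = muladd(x, x, one(x))   # x^2 + 1, generic over number types

mypoly(3)        # Int specialization        -> 10
mypoly(2.5f0)    # Float32 specialization    -> 7.25f0
mypoly(big(2))   # BigInt specialization     -> 5
```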

By using Julia, I hardly ever have any frustrations with programming. The type system and multiple dispatch are so versatile that I can always turn a mathematical idea into an implementation. After iterating through evolving design choices, the result is a naturally fitting algebra and computational language with an efficient and adaptive representation.

I seriously envy you. I think that Julia reduced my frustration with programming by a factor of 10, but there is still plenty of it left.

I just finished profiling, benchmarking, and micro-optimizing a medium-sized codebase so that it can run some estimation of an economic model over the weekend. I got a 10x speedup in the end, most of it from tracking down a noxious type instability that was only activated for AD, but I was seriously considering buying a farm (without internet) and keeping horses.
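For anyone curious what such a type instability looks like, here is a hypothetical minimal example (not the actual model code): an abstractly-typed struct field forces dynamic dispatch on every access, while a parametric field infers cleanly.

```julia
# Hypothetical minimal example of a type instability.
struct Unstable
    x::Real              # abstract field type: the compiler can't infer s.x
end
struct Stable{T<:Real}
    x::T                 # parametric field: concrete for each instance
end
addone(s) = s.x + 1.0    # fast for Stable, boxed/dynamic for Unstable

# `@code_warntype addone(Unstable(1.0))` flags the abstractly-typed access;
# the Stable version infers a concrete return type.
```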