Comparison tables for various Julia packages

Context

Julia enthusiasts are always pitching the language to colleagues, yet despite various technical arguments, we often fail to show how the language can concretely improve scientists’ productivity.

Given this context, I am starting this thread to collect concrete evidence that the language enables state-of-the-art package ecosystems.

Goal

The ultimate goal is to collect comparison tables between Julia package ecosystems and alternatives written in other languages. These tables can be worth 100 pitch talks if potential users have specific software demands.

Instructions

  • Please add a single table per comment
  • Paste the table/figure directly for posterity
  • Add a source link where the (updated) table is available

Aftermath

We plan to add a link to this thread in the Julia manifesto if the number of tables is representative of the Julia experience across various scientific fields.

I will start adding tables I am aware of in the comments below.

Thanks in advance for the help.

cc: @Datseris

GeoStats.jl

Source: Home · GeoStats.jl

RxInfer.jl

Source: RxInfer.jl vs. Others · RxInfer.jl

I made something similar for DelaunayTriangulation.jl here: GitHub - JuliaGeometry/DelaunayTriangulation.jl (a Julia package for Delaunay triangulations and Voronoi tessellations in the plane). I still need to fix the X’s for the GPL licenses though, thanks @tecosaur.

DifferentialEquations.jl


(Note: it’s from Chris Rackauckas’ 2017 blog post.)

Edit: to keep this thread as just a collection of examples, let’s move any debate about the merits of comparison tables to a separate thread (my bad).

I don’t think it’s possible to be unbiased. In fact, a common reason to implement new software is precisely that previous approaches didn’t provide everything the developers considered good features, and other people may view those same things as unnecessary or even bad features. I think the way to go is to make lists that are as in touch with the field as possible while acknowledging that they ultimately reflect the developers’ opinion of a good toolset. That shouldn’t be too difficult: most READMEs state the intent of the library, so the table can just reflect that. I’d also argue that the Julia package’s column should always go first, both to make it clearer that the list reflects the developers’ opinion, and to make the package’s to-do list easier to see for readers who don’t care about the other libraries.

If you are considering replying to the “universal human bias issue” raised by @adienes above, please try not to derail the goal of this thread, which is to collect tables. People can judge biases by themselves.

I hope this won’t become yet another debate on Discourse with endless back and forth of what is right or wrong…

I am a bit critical of those, because usually the author of a table is the one who wants to present their own package in the best light, but that is probably a topic for the detailed thread :wink:

We gave that a try when comparing manifold features and available manifolds in our paper on Manifolds.jl. Since external links were said to be undesired, here are screenshots, but you can also get the TeX code from arXiv.

Manifold features

Note that the latest version of the preprint is off by one element: Riemannian Hessians only became available when the galley proofs arrived, and we did not update the arXiv version.

Manifolds

This is a bit long, since we tried to be complete and collect all manifolds in all the packages we could find.

I am confused why the OP asked for this. I would say having the source of the tables is a prerequisite for posting them anywhere. Please do provide the source for the tables, for future reference and for robustness.

One table comes from ComplexityMeasures.jl, from our paper (which was just accepted in PLOS ONE and is now in the proofreading stage). The table is on arXiv ([2406.05011] ComplexityMeasures.jl: scalable software to unify and accelerate entropy and complexity timeseries analysis) and the copy/paste is:

I didn’t ask for hiding the source. Maybe my instructions aren’t clear enough?

Another table comes from our work on Agents.jl, which was published in SIMULATION two years ago. The problem is that this table is now definitely outdated: Agents.jl has even more features that you won’t find anywhere else, while on the other hand the competing software has improved its performance. We keep track of the latest performance trends in GitHub - JuliaDynamics/ABMFrameworksComparison: Benchmarks and comparisons of leading ABM frameworks. The paper the table comes from is at https://journals.sagepub.com/doi/10.1177/00375497211068820

And the table paste is:

But yeah, I would take this with a grain of salt because it is outdated. I think Mesa is now at v2, though to be honest I’m not sure how much genuinely new functionality that brought.

EDIT: Forgot the second part of the table that showed the performance comparison:

Replaced the 3 instructions above with the following for improved clarity:

  • Please add a single table per comment
  • Paste the table/figure directly for posterity
  • Add a source link where the (updated) table is available

Here’s the symbolic regression feature table for SymbolicRegression.jl (listed under “PySR” which is the more well-known Python frontend). This is from our paper.

Heads up: the paper is from May 2023, so the landscape (and our package!) has evolved. For instance, PySR/SymbolicRegression.jl can now handle high-dimensional inputs and plug into differential equations, features not fully reflected in this older snapshot. In retrospect, I might also add a couple of other rows for features PySR/SymbolicRegression.jl does not yet have, like arbitrary-arity operators or the ability to evolve a Turing-complete DSL.

Regarding the table itself: you might notice PySR/SymbolicRegression.jl has a :white_check_mark: in nearly every row. While skepticism is understandable in a scenario like this, broad feature coverage was the main goal of creating the package, so this is expected. Many of the other excellent tools listed are geared more towards exploring specific research ideas than towards being comprehensive software products. That is, this isn’t a statement on their inherent value, but rather reflects where development efforts were focused.

(And of course, we found that Julia’s ecosystem made achieving this breadth of features significantly more manageable than, say, starting with C++ would have! So it can still safely be used as a statement about the ease of Julia development.)
