What package[s] are state-of-the-art OR attract you to Julia, and make you stay there (not easily replicable in e.g. Python, R, MATLAB)?

Thought it might be helpful to drop in a link to Julia Observer:

juliaobserver.com

It might help you find some packages or find out what different people are working on


I also find it beautiful in practice :wink:


Agreed :slight_smile: I think the theoretical beauty is easy to discover and well-publicised, while the practical beauty could do with more advertisement / introduction!


I thank Jeff, Viral, Stefan, and Prof. Edelman (and the host of other contributors) every time I think about how much of my time has been saved by programming in Julia instead of Python, C, C++! (And that’s fairly frequently)
That practical beauty is the best selling point of Julia, IMO.


One of the most significant things that initially drew me to Julia is that I find there is a much smaller gap between abstract types and the purely mathematical abstractions they are supposed to represent than there is in other languages. I now know that much of this comes down to the failings of OO, although I’m not quite sure why more OO languages aren’t better at this; they certainly could try harder. For example, I don’t know of any OO language where it is quite as common a practice to inherit from some sort of AbstractArray type as it is in Julia (maybe I’ve just been looking at the wrong code). When I saw the Julia implementation of dual numbers, I knew there’d be no going back.
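
To give a taste of what that dual-number implementation looks like, here is a minimal sketch of the idea (the real packages, e.g. DualNumbers.jl and ForwardDiff.jl, are far more complete):

```julia
# A minimal dual number: f(Dual(x, 1)) computes f(x) and f'(x) together.
struct Dual{T<:Real} <: Real
    value::T   # the point x
    deriv::T   # the derivative carried along
end

# Arithmetic follows the rule ε² = 0, i.e. the sum/product/chain rules:
Base.:+(a::Dual, b::Dual) = Dual(a.value + b.value, a.deriv + b.deriv)
Base.:*(a::Dual, b::Dual) = Dual(a.value * b.value, a.deriv * b.value + a.value * b.deriv)
Base.sin(a::Dual) = Dual(sin(a.value), cos(a.value) * a.deriv)

f(x) = sin(x * x)        # generic code, written with no AD in mind
d = f(Dual(2.0, 1.0))    # d.value == sin(4.0), d.deriv == 4.0 * cos(4.0)
```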

When I talk to people about how much I love the language they tend to think that I had some sort of stake in it early on, or some involvement with the development (I have yet to ever make a single commit to the Julia repo itself), but the truth of it is that I discovered it during the process of objectively comparing alternatives to Python for doing data science. Initially I thought for sure I’d wind up using Scala, and at one point I thought that the dominance of Python in terms of code availability meant there’d be no choice other than Python+Numba(+Cython?). I’m glad I was wrong.


First, thanks (all of you) for the excellent answer[s]. I’ll follow up with some comments and questions, first in general, then returning to those technical domains/packages/issues.

Do (all of) you know of any down-to-earth packages, e.g. for web programming or GUI building, that are state-of-the-art, or at least as good as what’s available elsewhere? If not, do you foresee that Julia will not get there, for technical or other reasons? If so, we’ll keep having a two-language problem, i.e. using JavaScript alongside Julia rather than transpiling to it (is that really good now?). Or people will simply not think of using Julia, as the mindshare is elsewhere.

Julia is not only about speed; I believe it’s a better language than most, but I’m not sure its killer advantage, i.e. multiple dispatch, really translates to better code for web programming than, say, Python and Django. Possibly it’s simply not needed, or, worse, something is missing from Julia (rather than just packages)?

On one thing Julia is better than most:

None of the mainstream (and more) languages (i.e. [Objective-]C/C++, C#, Java (and most JVM languages such as Scala or Groovy), Python, Go, Lisp, F#, Ruby, PHP, Perl) gets a full score for Tony Hoare’s self-admitted billion-dollar mistake, null references:

Only Rust, Swift, Haskell, and OCaml are on the list with 5 stars. And I believe Julia is simply missing from the list as a 5-star language (Julia’s `missing` or `C_NULL` would be unfair to list as problems).
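
To make concrete what 5-star null safety looks like in Julia: absence is an ordinary value (`nothing`) tracked by the type system, not a null pointer. A minimal sketch (the `find_index` helper is hypothetical):

```julia
# Absence is encoded in the type: the return type Union{Int, Nothing}
# forces callers to handle both the found and not-found cases.
function find_index(xs, target)::Union{Int, Nothing}
    for (i, x) in pairs(xs)
        x == target && return i
    end
    return nothing
end

idx = find_index([10, 20, 30], 20)
println(idx === nothing ? "not found" : "found at index $idx")
```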

Back to technical programming and your post:

I count no fewer than 17 packages [18 with JuliaDB, not sure if you would have it in the list] (plus other people had more suggestions, e.g. Bio.jl) that I believe you consider state-of-the-art, including JuMP (I believe the first such I knew about) and the second such I heard of, your own DifferentialEquations.jl. I had a feeling there might be more, e.g. Flux and Bio.

OnlineStats.jl, Turing.jl and DynamicHMC.jl, Flux.jl, DifferentialEquations.jl and JuMP, IterativeSolvers.jl, BandedMatrices.jl, BlockBandedMatrices.jl, ApproxFun.jl, InfiniteMatrix (InfiniteArrays.jl), Distributions.jl, LightGraphs.jl, MetaGraphs.jl(?), DynamicalSystems.jl, QuantumOptics.jl

Is GPUArrays also state-of-the-art, or more broadly GPU-related? Is it similar to ArrayFire (which is in C++ but has a Julia wrapper)?

Another user added Knet.jl (and more); I guess Flux and Knet can both be state-of-the-art. And I’m not wedded to Julia-only: TensorFlow.jl seems better (in its API) than Google’s own official Python wrapper. I don’t yet know if all of these and more are competitors or maybe also complementary.

Is PyTorch redundant given better Julia packages? Or do people use it from Julia? I’m not sure whether it’s missing a wrapper or whether one is just not needed (same with other ML frameworks such as CNTK). It seems, however, to be used calling Julia:

You seem to be saying that using Python (and R) is OK, as long as it’s with Julia packages. Then you may as well just use Julia (possibly with the occasional Python package via PyCall).

Elsewhere, on your own package, what I’ve been quoting is “it would never have been made with C++”. I think you’re also saying, may I “quote” you as adding “nor would it have been made with Python or R, even in combination with C++”?

I’m not sure I understand the issue, but is this also a downside of C++, i.e. is Julia better than either language, or a combination of both, in this regard?

“[regarding packages for some domains] I would even say that Julia surpassed MATLAB/Python/R awhile ago since it’s hard to find comparable packages to ours in these other languages.” [not just for edge cases such as “where higher precision matters”?]

I googled and found banded matrix libraries for C++, Python, C#, and even Visual Basic. Are you saying that all those you know about, even in C++, only give you the type but are not “SCREAMING FAST” because the structure is not exploited? Can you think of more such cases, not just in linear algebra? There are more matrix types than I care to know about, and I can see sparseness (or other) properties being exploited, but I’m not sure about elsewhere. Does multiple dispatch help with this?

You mean just for now, i.e. you don’t see any showstoppers to implementing such? Is PETSc with Julia a good substitute for now, or are you saying it’s not, and that Julia is only faster single-threaded in some cases?

Yeah, GPUArrays goes on the list. ArrayFire is different since it’s built around custom allocators and short kernels. GPUArrays uses Julia’s broadcasting to do kernel writing on the fly. Fusing GPU kernels means a lot since they can have pretty high overhead to call, so GPUArrays can be much faster if your arrays aren’t sufficiently large and you’re doing a lot of element-wise operations.
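
For concreteness, here is roughly what that fused broadcasting looks like, as a minimal sketch (assuming a CUDA-capable machine with CuArrays installed):

```julia
using CuArrays   # GPU arrays that plug into Julia's broadcasting machinery

x = cu(rand(Float32, 10^6))
y = cu(rand(Float32, 10^6))

# The three dotted operations below fuse into a single GPU kernel,
# rather than one kernel launch (and one temporary array) per operation:
z = 2f0 .* x .+ sin.(y)
```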

Yes, that’s all correct. I am saying that it would’ve been almost impossible to write in C++, and using Python+C++ or Python+Cython/Numba would’ve been hard (the abstract typing would’ve been almost impossible to replicate), and Rcpp wouldn’t have done it either. It’s perfectly fine to use it from those languages though, which is why I made diffeqpy and diffeqr.

If you pass a Julia-defined function like I show in the README, you get essentially the speed of Julia when using it from Python/R, albeit with the weird two-language issue. Of course I would recommend just using Julia, but if someone already knows R or Python and just wants to tag on some differential equation solving, then it’s not a bad option and it won’t really slow them down (but you do need to define the derivative function in Julia, as shown in the release blog post).

These two are related. As I pointed out, you have things like a banded linear solve in SciPy, but it’s a separate function. So if someone writes a geometric multigrid, the writer of that function has to explicitly say that banded matrices are allowed. So most of the iterative linear algebra routines, stiff ODE solvers, etc. don’t allow for special matrix types at all (or it’s Sundials which specifically hardcodes banded matrices as an option).

In Julia, the operations are defined by the type, so it’s not the algorithm writer but the user who decides which kinds of types can be specialized on (as long as the internal algorithm just accepts the given types). So IterativeSolvers.jl allows you to use BandedMatrix types that the IterativeSolvers.jl authors have never heard of, and it will work. This is a completely different level of usability. And while banded matrices are the one case where determined authors have hardcoded support where necessary (since they are common in PDEs), you can ask: did they specialize on Toeplitz? Block-banded? Etc. The answer is almost always no (unless it was made for a specific PDE with a specific property), but any generic Julia algorithm gets to say yes for every single possible type, because the actions on the matrix are controlled by the user and not by the generic algorithm itself.
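
A small sketch of what that looks like in practice (a hypothetical example; `brand`, which builds a random banded matrix, and the generic `gmres` call are the assumed pieces):

```julia
using LinearAlgebra, BandedMatrices, IterativeSolvers

# A matrix type the IterativeSolvers.jl authors never special-cased:
A = brand(1000, 1000, 2, 3) + 10I   # random banded matrix, bandwidths (2, 3),
                                    # shifted to be diagonally dominant
b = rand(1000)

# gmres only needs matrix-vector products, which BandedMatrices defines
# efficiently for its own type, so the banded structure is exploited
# without IterativeSolvers.jl knowing that banded matrices exist.
x = gmres(A, b)
```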

This can be a huge deal in PDEs. Banded matrices have a fast QR. I believe block-banded matrices get a fast factorization as well. Almost any system of PDEs can be represented with these two types, so that should be much better than a generic SparseMatrixCSC. I’m sure there’s lots of other cases where this kind of stuff comes up.
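
For instance, a sketch of that fast banded factorization on a 1-D Poisson-type (tridiagonal) operator, assuming the banded `qr` specialization in BandedMatrices.jl:

```julia
using LinearAlgebra, BandedMatrices

# Second-difference (1-D Poisson) operator stored as a banded matrix:
n = 10_000
A = BandedMatrix(-1 => fill(-1.0, n - 1),
                  0 => fill( 2.0, n),
                  1 => fill(-1.0, n - 1))
b = rand(n)

# qr dispatches to the banded QR factorization, so the solve scales
# linearly in n for fixed bandwidth instead of like a dense factorization:
x = qr(A) \ b
```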

PETSc.jl is a good substitute for now, but it’s not special compared to what you’d find if you just used C++ directly. It is nice to have it in Julia with the basic linear algebra dispatches (so then it too works in IterativeSolvers.jl!) but you’re not going to find much else.


Many have already said it, but two things have kept me using Julia since v0.2 (besides the superb features of the language itself):

  1. The amazing, active, and friendly community;
  2. The DifferentialEquations.jl package.

I do a lot of simulations that have a continuous part and a discrete one (like the actuation of the onboard control subsystem of a satellite). In the past, I used Simulink, but it is not good for collaborative development. Hence, I started to use .m files, which are very slow and make it tricky to mix the discrete and continuous parts.

With the help of @ChrisRackauckas, I could do such simulations in Julia in an easier and more elegant way than I have ever seen. This, IMHO, is a package that is not matched by any other available tool!
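
Not my satellite model, but a toy sketch of that continuous-plus-discrete pattern using a DifferentialEquations.jl callback (the dynamics and control law here are made up for illustration):

```julia
using DifferentialEquations

# Continuous part: the state relaxes toward the current command p.u_c.
f!(du, x, p, t) = (du[1] = -(x[1] - p.u_c[]))
p = (u_c = Ref(0.0),)

# Discrete part: a controller that updates the command every 0.1 s,
# like an onboard control loop running alongside the continuous plant.
controller = PeriodicCallback(0.1) do integrator
    integrator.p.u_c[] = sin(integrator.t)
end

prob = ODEProblem(f!, [1.0], (0.0, 5.0), p)
sol = solve(prob, Tsit5(), callback = controller)
```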


This! I remember that when I first heard about this AD magic during some JuliaCon talk, I was skeptical (mostly because I actually didn’t know something like that existed) that it would work as advertised and not just in some specific cases, AND that it would be fast. When I then used it once or twice, basically because I was too lazy to do the derivatives myself, I was in disbelief, until I tried some more fancy things and it just kept working perfectly. Then I looked through some of the theory behind it, and when I saw how elegantly it was reproduced in Julia, it completely sealed the deal for me (I was already pretty convinced though, to be fair).
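
For anyone curious, trying that dual-number AD is a one-liner with ForwardDiff.jl (the function below is just a made-up test case):

```julia
using ForwardDiff

f(x) = sin(x[1]) * exp(x[2]) + x[1]^3   # arbitrary test function

# Exact gradient via dual numbers: no hand-written derivatives,
# no finite differences, and it works on generic Julia code.
g = ForwardDiff.gradient(f, [1.0, 2.0])
```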

Idem; sadly, I don’t encounter too many differential equations or things JuMP could help me with.

I second this statement; it’s a concise summary with everything needed, very nice.

As for my own experience, I’ve never before had the level of excitement that I had when I watched all these amazing packages and capabilities being presented at last JuliaCon (I first stumbled across Julia 3 months prior). Even packages in domains that I knew nothing about seemed super cool.

I must say that, coming from what felt like a constant wrestling match with matplotlib, Plots.jl really is a breath of fresh air. Being able to define recipes is very useful in my day-to-day workflow. The user-friendliness of GPUArrays.jl, CuArrays.jl, and CUDAnative.jl is also incredible, usually removing the need to write my own CUDA kernels, and making the few that I did write almost exactly the same as the functions they replaced.
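
To show what defining a recipe involves, here is a minimal sketch (the `SimResult` type is made up; `@recipe` comes from RecipesBase, re-exported by Plots):

```julia
using Plots

# A toy result type plus a plot recipe, so that `plot(result)` just works
# and picks up sensible labels without any boilerplate at the call site:
struct SimResult
    t::Vector{Float64}
    y::Vector{Float64}
end

@recipe function f(r::SimResult)
    xlabel --> "time (s)"
    ylabel --> "state"
    seriestype --> :path
    r.t, r.y
end

plot(SimResult(collect(0:0.1:1.0), rand(11)))
```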


Nemo (https://github.com/wbhart/Nemo.jl) for arbitrary-precision computation is fantastic for statisticians needing to compute incomplete gamma functions, combinatorial coefficients, and, more generally, numerically unstable likelihoods and distributions.
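
For a flavor of that, a tiny sketch with Nemo’s ball-arithmetic reals (I’m assuming `ArbField` and a `gamma` method on its elements, which I believe Nemo provides):

```julia
using Nemo

RR = ArbField(256)   # real ball arithmetic with 256-bit precision

# Γ(x+1) = x·Γ(x): in Float64 both sides overflow to Inf near x = 171,
# but arb balls give a rigorous high-precision enclosure instead:
x = RR(200)
gamma(x + 1) / (x * gamma(x))   # encloses 1
```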

On another topic, https://github.com/yunjhongwu/TensorDecompositions.jl has some algorithms which are hard to find outside MATLAB and the Tensor Toolbox.


Interesting application. (By the way, we moved to https://github.com/Nemocas/Nemo.jl)

IMHO, PYGMO (the Python interface to the C++ pagmo library) is better than JuMP, at least for the use case I am interested in (space trajectory optimisation).
PYGMO handles both local and global optimisation.
JuMP seems more convenient for simple problems where you do not need to call user functions, but for real-world complex problems, where your objective and constraints are computed from the results of differential equations with a large (1000+) number of parameters, PYGMO looks more useful.
Now, I do not know of anything in Python coming close to DifferentialEquations.jl…

Yes, it is definitely better for that use case, but that’s not really the sort of thing JuMP is designed for (at least not primarily). Sounds like you are coming to it from a similar direction to mine when I started using it: JuMP is mainly used for getting problems into a standard form for cases where there is a (usually non-stochastic) method for solving that particular form. You are probably asking yourself the same question I was when I was first presented with those problems: isn’t this trivial for these simple linear, quadratic, semidefinite, etc. problems? Well, no: in most of these use cases you tend to have to alter the problems a lot, and typically they are much easier to formulate in a form different from the one they are solved in.
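
For readers who haven’t seen that standard-form style, a toy linear program in JuMP (recent JuMP syntax, with the GLPK solver assumed to be installed):

```julia
using JuMP, GLPK

# A toy LP in the "standard form" style JuMP is built around:
model = Model(GLPK.Optimizer)
@variable(model, x >= 0)
@variable(model, y >= 0)
@constraint(model, x + 2y <= 4)
@objective(model, Max, 3x + y)

optimize!(model)
value(x), value(y)   # the optimal point
```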

Anyway, PYGMO looks to me like a really great C++ library with a Python wrapper. It looks, for instance, like the entire MGA-1DSM problem was implemented in C++ already, which is great, but imagine doing it all in Python: that’s basically what Julia would offer you, having the language you set up your problem in be the same as the language you solve it in.

You absolutely can do this in JuMP, many of my use cases of it involve millions of parameters, but again, they are being used in different types of problems.

Have you ever looked into implementing MGA-1DSM in Julia? That could be a really cool project; my guess would be that Optim.jl might provide some sort of starting point for tooling, but I really have no idea what kinds of algorithms are actually used for solving it. I found this description of the problem online, which is very cool, as I definitely did not have much of an idea of how these sorts of astrodynamics problems are solved in practice.


I don’t really get the comparison between JuMP and PYGMO—they do not seem to be addressing the same kind of optimization problems. PYGMO doesn’t seem to support any LP solvers at all, which is the core application area for JuMP. Am I missing something?


I forgot to mention Parameters.jl. AFAIK there is nothing like this in MATLAB, and the possibilities to greatly simplify the code are countless.
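
For those who haven’t seen it, a small sketch of what Parameters.jl buys you (the `SimConfig` type is made up):

```julia
using Parameters

# @with_kw adds a keyword constructor and per-field defaults to a struct:
@with_kw struct SimConfig
    dt::Float64     = 0.01
    tmax::Float64   = 10.0
    reltol::Float64 = 1e-8
end

cfg = SimConfig(tmax = 5.0)   # all other fields keep their defaults
@unpack dt, reltol = cfg      # pull fields out into local variables
```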


I was actually talking about NLP, since AFAIK JuMP is also the Julia tool for NLP.

AFAIK Optim.jl only supports box constraints. For more complex constraints, you have to use objective penalties, which may be fine for global optimisation methods when you start from scratch, but when you are close to the solution you really want to refine it with a local NLP solver (with constraints). PyGMO has some meta-algorithms which perform a stochastic search and refine the samples using a local optimizer.
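
For reference, the box-constrained case looks like this in Optim.jl (toy objective; `Fminbox` wraps an unconstrained inner optimizer with a barrier method):

```julia
using Optim

# Toy objective with box constraints 0 <= x_i <= 2:
f(x) = (x[1] - 1.0)^2 + (x[2] + 0.5)^2
lower = [0.0, 0.0]
upper = [2.0, 2.0]
x0 = [1.0, 1.0]

res = optimize(f, lower, upper, x0, Fminbox(GradientDescent()))
Optim.minimizer(res)   # ≈ [1.0, 0.0]: the second coordinate hits its bound
```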


Or transformations to boxes and/or \mathbb{R}^n. The latter is quite common in statistics.

If you have a twice-differentiable objective and once-differentiable nonlinear constraints, we have an interior-point Newton solver for those problems in the version about to be tagged (of Optim.jl, that is).
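
A sketch of what using that interior-point Newton solver looks like, as I understand the API (toy problem; note that the constraint Hessian contribution must currently be supplied by hand, which is the second-order requirement raised below):

```julia
using Optim

# Toy problem: minimize (x₁-1)² + (x₂-2)² subject to x₁² + x₂² <= 1.
f(x)      = (x[1] - 1.0)^2 + (x[2] - 2.0)^2
g!(G, x)  = (G[1] = 2(x[1] - 1.0); G[2] = 2(x[2] - 2.0))
h!(H, x)  = (H[1, 1] = 2.0; H[1, 2] = 0.0; H[2, 1] = 0.0; H[2, 2] = 2.0)

con!(c, x)  = (c[1] = x[1]^2 + x[2]^2; c)
conJ!(J, x) = (J[1, 1] = 2x[1]; J[1, 2] = 2x[2]; J)
# Constraint Hessian contribution, weighted by the multiplier λ:
conH!(H, x, λ) = (H[1, 1] += 2λ[1]; H[2, 2] += 2λ[1]; H)

df  = TwiceDifferentiable(f, g!, h!, zeros(2))
dfc = TwiceDifferentiableConstraints(con!, conJ!, conH!,
                                     Float64[], Float64[],  # no variable bounds
                                     [-Inf], [1.0])         # c(x) <= 1
res = optimize(df, dfc, zeros(2), IPNewton())
```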


The Newton solver currently requires the constraints to be twice differentiable as well, I believe? It would be nice to add an interior-point quasi-Newton algorithm to allow once-differentiable objectives and constraints.

As an aside, which may be why you mentioned first-order constraints: I have implemented autodiff constructors for the Jacobian of the constraints, but I’m not sure how to do the second-order derivatives (the constraint Hessian) in an efficient way.
