Would slowing Julia's release cadence improve ecosystem quality?

Could it help improve the quality of the Julia ecosystem if Julia releases slow down their pace after the next LTS release? Are there any important features of Julia 1.11 or 1.12 that are worth getting as soon as possible?

I don’t see what the frequency of branching a new Julia version has to do with the quality of the ecosystem or the quality of the language itself. The releases themselves are not actually on a strict timer. IIUC there are (somewhat) defined timepoints when a new version is branched, but then it is stabilized, which can take a (theoretically) arbitrary amount of time, during which no new features are merged and only bugfixes land. Only after the version has reached high enough quality is it actually released. (Meanwhile, new features continue to be developed on the main branch.)

Could you explain how/why a slower release frequency would help? I think it might reduce some logistical overhead, but I don’t think it would reduce the total amount of testing/bugfixing, or generally the work needed.

3 Likes

Wouldn’t a slower release frequency give package developers more time to fix their packages by having to support fewer versions?

Julia’s minor releases aren’t breaking. If it works now it will work until Julia 2.X.

YMMV.

6 Likes

I wish it were like that! In practice, almost every Julia minor release breaks lots of popular packages, requiring updates at least somewhere in the dependency tree.

1 Like

Do you have examples? This is not my experience.

The most common breakages I’ve seen have been various changes to printing, which break a package’s tests (because the returned string is no longer exactly equal) but don’t break user code, because the higher-level functions etc. all still work.
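To illustrate (with a made-up type and tests, not taken from any particular package): a test that compares the exact printed string is brittle across Julia releases, while a test of the actual behaviour is not.

```julia
using Test

# Hypothetical type, just for illustration.
struct Interval
    lo::Float64
    hi::Float64
end

Base.show(io::IO, x::Interval) = print(io, "Interval(", x.lo, ", ", x.hi, ")")

x = Interval(0.0, 1.0)

# Brittle: tied to the exact printed output, which a new Julia release may change
# (e.g. how numbers or type parameters are displayed).
@test sprint(show, x) == "Interval(0.0, 1.0)"

# Robust: checks the behaviour user code actually relies on.
@test x.lo == 0.0 && x.hi == 1.0
```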

3 Likes

The most recent large breakage is that earlier versions of SortingAlgorithms.jl don’t work on Julia 1.9. The SortingAlgorithms package has a huge number of dependents, and its earlier versions just don’t load on 1.9 at all.
This is exactly breaking existing user code: the same packages + versions that worked before now don’t.

For more examples, see Forward compatibility and stability of Julia vs. Packages - #42 by aplavin (and the whole thread, btw).

1 Like

I’m not sure about Zygote in particular, because Zygote is weird and relies on compiler internals.

Outside of Zygote, no. The problem isn’t with Julia breaking packages, it’s with there being exactly $0 going into improving them. Distributions.jl is borderline unusable, despite being arguably the single most important package in the entire statistical ecosystem. There are basic correctness bugs in Distributions.jl that have been stuck there for 3 years now with no incoming fix because there’s nobody to review the PRs.

3 Likes

To tell the truth, not having had any of these problems myself, that sounds quite strange. You could not install CUDA or LoopVectorization on some Julia version (any of them?)? I have used those forever and have never seen that.

(Or do you mean that old versions of those packages could not be installed on newer Julia versions? Well, maybe; I didn’t test that. I just never had an issue with them, but perhaps that’s because the package authors are quick.)

Yes, exactly that. I’m not sure how to interpret “an old version of a package works with Julia 1.x but doesn’t with 1.x+1” other than “a Julia update breaking perfectly working code”. Note that this doesn’t mean either of them violates semver.

Yes, (some) package authors have to update their code so that it works on newer Julia versions.
Surely it’s not

I guess you just don’t pin your package versions to increase reproducibility, and that’s why you haven’t encountered this breakage. But it definitely exists and is very common: basically every Julia release requires updates to some very popular package(s).

1 Like

This is going in circles because y’all are saying the same thing while saying the exact opposite thing at the same time. Let’s make this more precise. Almost no packages directly break with Julia’s minor releases. Those that do tend to be the ones that interact with compiler internals, specifically Zygote, Enzyme, Cassette, and packages like those. However, there are a lot of packages which use those packages, and thus indirectly need a bump upon a new Julia minor release because of the updated dependency.

Therefore, almost no packages directly break with a Julia release, but a large swath of packages get an update that makes them no longer backwards compatible in all ways. Both are true.

With that being said, the maintenance burden of a Julia update is not very large unless you’re hacking on one of the parts interacting with the compiler. But it does mean that, in its current stage, the ecosystem “has to move on” from older versions since the dependency bumps effectively lead to support gaps where the next 1.x+1 will have one set of packages while the 1.x will be on a different set of dependencies.

13 Likes

Basically yes, and I’m not even saying this breakage is objectively a bad thing. Though I personally would prefer sticking closer to

What I was surprised about is how many packages (indirectly) break! Like, would one expect StatsBase to stop working on a new Julia version (and it did) unless its deps were updated?
So I think user awareness is lacking in this area.

The issue there is the lack of tools for improving lower bounds.

1 Like

I’m not sure how any tooling can help if a package (version) just stops working on newer Julia versions. But it would be great if something could be done!

I have already responded to you with a detailed analysis of why the exact problem you’re having is actually a problem of lack of tooling on lower bounds support. You might want to give it a second read:

From this you can see that the issue of jumping to older Julia versions is usually an issue of unmaintained lower bounds, not upper bounds. And we have no tooling for bumping lower bounds, which is why the issue persists.

1 Like

But isn’t @aplavin’s point that if you pin all versions in a Manifest, then for most non-trivial manifests your stack will include some package that has some dependency which uses Julia internals so going from 1.x to 1.x+1 with the exact same package versions will break your code? And that traditionally non-breaking as per semver would be understood to cover this case of all versions pinned?

Manifests are versioned to specific Julia versions, so using them on a different Julia version isn’t guaranteed to work. If a newer version of a package uses a feature from a newer Julia then you wouldn’t expect it to work. If a newer version of a package bumps the lower bound for the Julia version then you wouldn’t expect it to work. So assuming that manifests work between different Julia versions is doomed from the start. Constraining it to the case of 1.x to 1.x+1 is trying to find a window where something that’s not expected to ever work in general may accidentally work, and hoping that things that accidentally work actually work is not a great strategy.
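For concreteness: since manifest format 2.0 (Julia 1.7+), the Manifest.toml itself records which Julia version it was resolved with, right at the top (the values below are illustrative):

```toml
# Manifest.toml header (illustrative values)
julia_version = "1.9.4"
manifest_format = "2.0"
```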

I was interpreting it as the reasonable expectation of stability, which is that today I should be able to go to Julia v1.7 and ]add RandomPackage and just expect to get a working installation. Maybe not all of the dependencies are up to date, but why should I care: it should just work according to the docs and tests, right? This is also the user-facing definition, since users only care about the details if this doesn’t work: if ]add RandomPackage always seems to work, then all of this talk about manifests and yada yada just doesn’t matter. Who cares about manifests if adding a package just works?

The reason why manifests have even entered the picture is that if you do go to Julia v1.7 and ]add RandomPackage, you will most likely get some errors at precompilation time. So then the answer is “well, to use an old Julia version, use a manifest that has the versions fixed for how it was back in time”. The manifest shouldn’t need to be there; that’s a hack around a current issue. The reason you cannot easily make an earlier Julia version work is that some packages (or likely just one package) forgot to bump a lower bound on either Julia or some dependency: the one dependency that got a bump in order to fix v1.8 didn’t say that it should only install on v1.8, and now your v1.7 version set won’t build. This lack of a lower bound makes us say “okay, so we need to use manifests everywhere!”, and therefore we start talking about version incompatibilities between manifests.

But it’s a complete X-Y problem. The root cause is that one package didn’t properly bump its lower bounds, so now every package that relies on it won’t install on v1.7, and therefore you manually have to set package versions to make v1.7 work (normally the easiest way to do that is via a manifest). Manually setting versions shouldn’t be the answer; that’s just a hack, and any time we tell users to do something like that we should cringe.

The way to fix that is to have tooling that requires, at some point, maybe at the point of registration, that if you have a lower bound on dependencies and Julia, you test that those lower bounds actually work. A package supports Julia v1.6 and tests on v1.6: that’s normal and :+1: . The root cause of the DataFrames issue that I pointed out was that DataFrames’ Project.toml said that it supported SortingAlgorithms v0.3, but in reality it didn’t, and no one ever tested that, since the tests only ever check the newest compatible version (the ^ resolution), which was SortingAlgorithms v0.3.2 at the time. If the lower bound for DataFrames had been bumped to SortingAlgorithms v0.3.2, then older versions of Julia would still work just fine: you could install it, read the docs, and never know you had an older version of the dependency. DataFrames, if its version bounds aren’t lying, doesn’t care.
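Concretely, the fix amounts to a one-line compat bump in DataFrames’ Project.toml (the entries shown here are illustrative, not the actual file):

```toml
[compat]
# Before (a lie): any 0.3.x is allowed, including 0.3.0 and 0.3.1,
# which DataFrames doesn't actually work with.
# SortingAlgorithms = "0.3"

# After: default caret semantics make this mean [0.3.2, 0.4.0),
# so the resolver can never pick a version below the one that actually works.
SortingAlgorithms = "0.3.2"
```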

The problem is that it’s easy for lower bounds to be lies because there’s no tooling to test this. It’s not just DataFrames: every package has this issue. Everyone is accidentally lying about which lower bounds are supported, they rarely get bumped or dropped, and therefore package resolution can easily give you something that is within the specified bounds but doesn’t actually work.
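As a rough sketch of what such tooling could look like (this is not an existing Pkg feature, and the version-string handling is deliberately simplistic): rewrite every [compat] entry to an exact pin at its declared lower bound, then run the package’s test suite against that resolved set in CI.

```julia
using TOML

# Pin every [compat] entry in a Project.toml to its declared lower bound,
# so that `Pkg.test()` exercises the oldest versions the package claims to support.
function pin_to_lower_bounds!(project_path::AbstractString)
    project = TOML.parsefile(project_path)
    compat = get(project, "compat", Dict{String,Any}())
    for (dep, spec) in compat
        dep == "julia" && continue                            # leave the Julia bound itself alone
        first_range = strip(first(split(String(spec), ',')))  # e.g. "0.3" from "0.3, 0.4"
        lower = lstrip(first_range, ['^', '~', '=', ' '])     # drop any specifier prefix
        parts = String.(split(lower, '.'))
        while length(parts) < 3                                # pad "0.3" to "0.3.0"
            push!(parts, "0")
        end
        compat[dep] = "=" * join(parts, '.')                   # equality specifier: exactly the lower bound
    end
    project["compat"] = compat
    open(io -> TOML.print(io, project), project_path, "w")
    return project_path
end

# Usage sketch: rewrite the compat section, then resolve and test against it.
# pin_to_lower_bounds!("Project.toml")
# import Pkg; Pkg.resolve(); Pkg.test()
```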

5 Likes

What would be the pros and cons of a slower Julia release frequency?

The con is that the cutting-edge AD stuff would tell people to use the nightly build, because the feature they needed for the performance update was added to Julia two years ago but a release hasn’t been made. Also, the cost of updating for two fixes at once is far greater than the cost of updating for one fix at two separate times, so the maintenance burden would increase.

The pro would be that it would be out of sight and out of mind for longer, so you could just not care about updates for a while.

But this is why in SciML we’ve gone to a continual release process: a release on each PR. We’ve found that makes updating way simpler, since if something does happen, everyone knows exactly which PR is the culprit. There’s no hunting around, no conglomeration of changes leading to 3 different errors you found at the same time. Just a nice clean “this is what changed”, and version bumps are generally pretty quick. That, plus we try to downstream-test as much as possible.

12 Likes

Yes, manifests are Julia-version-specific (although this isn’t really well known, in my experience), but pinning package versions for better reproducibility can be done with Project.toml alone.
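For example, exact pins can live directly in a project’s [compat] section using equality specifiers (package names and versions below are just illustrative):

```toml
[compat]
julia = "1.9"
DataFrames = "=1.6.1"
StatsBase = "=0.34.2"
```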

I wanted to avoid conflating different meanings of “compatibility”, and only talked about “update Julia, keep all code (including package code) the same”. Sure, there are other scenarios, like “a new package version should only be compatible with a new Julia”, but that’s an independent story.

Again, this is a completely different meaning of compatibility, not the one I’m talking about.
This one seems easier to fix: you just need better tooling to test lower bounds. There is some, but it’s not polished or advertised, e.g.:

Meanwhile, “Julia updates breaking existing package code” doesn’t seem easily fixable.

1 Like