Would slowing Julia's release cadence improve ecosystem quality?

No, every new version introduces new features, so 1.9 code is not guaranteed to work on 1.8, but only on 1.10, 1.11, etc.

6 Likes

I feel like the issue here is the use of compiler internals. They're called internals for a reason, and that's a big reason Zygote is a broken, buggy mess. Either we need to create a clear, public interface that won't break from update to update and is enough for Enzyme/Zygote/Diffractor to rely on (is that possible?); or we take the Python approach of "let the compiler leak everywhere and freeze it in place"; or we just admit these packages were a bad idea and cut our losses. We don't have the big tech money we'd need to keep making substantial fixes to these packages with every minor update. If nothing else, we can't accept a bus factor that small; I think we've seen exactly that situation with Zygote having nobody to work on it after Mike Innes left for Relational AI.

True! I certainly overstated there.

Yeah, that's probably true for some of the specific cases here. But it still likely caused a compatibility issue for many folks trying to set up environments on Python 3.5, and I doubt anyone called out the Python developers for that.

I do still feel like a lot of these are package issues, not language issues, and the two are being conflated.

1 Like

How long would it take Julia to include all the necessary mechanics for those libraries?

No, your package would not break, because you would then update the Requires-Python entry in your pyproject.toml to Requires-Python: >=3.6.
Letā€™s say you did this in your package version N, and in N-1 it was >=3.0.

People in an environment with Python 3.5 would only be offered up to N-1 for installation. People with a newer Python would have up to N available.
If, in an env with a Python 3.5 interpreter, you want to install version N (e.g. from some description file analogous to Manifest.toml), the package manager would balk and offer you more or less useful debugging info about which compat bounds cause the conflict.
It would not install the package and leave you to discover the problem later.
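To make the mechanics concrete, here is a minimal sketch of what that metadata change could look like; the package name and version numbers are hypothetical, and note that in pyproject.toml the field is spelled `requires-python` (the `Requires-Python` spelling is the core-metadata field it maps to):

```toml
# pyproject.toml of version N of a hypothetical package "mypkg"
[project]
name = "mypkg"
version = "2.0"               # version N
requires-python = ">=3.6"     # was ">=3.0" in version N-1
```

With this published, a resolver running under a 3.5 interpreter simply never considers version 2.0 and falls back to the newest release whose bound it satisfies.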

So, I don't know what your statement below is grounded in? :confused:

But it still likely caused a compatibility issue for many folks trying to set up environments on Python 3.5, and I doubt anyone called out the Python developers for that.

If you are not using the packaging metadata correctly (e.g. by using 3.6+ features, but not giving the correct Requires-Python entry) then :person_shrugging: there's not much the package manager can do, and I guess this is true for Python as well as Julia?

1 Like

This is my larger point. The sense I get is that a versioning/metadata issue like this in Python would be blamed on the package, but in Julia often gets blamed on the language. Part of it, I think, is that Python's mega packages (NumPy, SciPy, etc.) take up one dependency, whereas in Julia there are many (dozens? more?) dependencies for the same functionality, so more versioning issues exist simply because more packages are necessary?

edit: the idea of better tooling here for updating lower/upper bounds seems to be a good one.
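For comparison, the analogous mechanism on the Julia side is the `[compat]` section of Project.toml, which is what such tooling would update; the package name and bounds below are hypothetical:

```toml
# Project.toml of a hypothetical Julia package
name = "MyPkg"
version = "2.0.0"

[compat]
julia = "1.6"        # caret bound: any 1.x release with x >= 6
SomeDep = "0.5, 1"   # two allowed ranges: 0.5.x, or 1.x
```

Pkg's resolver treats each bare entry as a semver caret specifier, so raising a lower bound here plays the same gatekeeping role as Requires-Python does for pip.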

2 Likes

Turing.jl was built off of <$100k… I can't even imagine what I could do with $5M. That's more than Stan got for sure.

I have no idea and I think that's a major part of the problem. Only a very small number of Julia devs (2-4 maybe) know what kind of deep internals Zygote relies on.

@mihlybaci, what you are getting at is spot on: Julia does take non-breaking releases and semver more seriously.

I wrote my first package for Julia something like 5 years ago and have barely changed it since. It's heavy on macro use and touched some internals to return the right line number nodes.

Some of those internals changed around 2020, and the tiny PR to keep it working on all Julia versions was submitted by a core developer when they caught it in testing.

I'm not sure if that happens in other languages, but it seems unlikely. Who else even tests every package in the registry with every release?

17 Likes

Enzyme really doesn't have a big need for Julia compiler features. We still support Julia 1.6 (and beyond).

The exception to that is that I've found bugs in Julia's garbage collector, which I've upstreamed to Julia, but whose fixes are only available on later versions. What we do there is work around them with a slower fallback on affected versions where we would otherwise generate the buggy calls, and use the faster ones otherwise (see https://github.com/EnzymeAD/Enzyme.jl/blob/daa30d68bd95fa6bc635decf6205b18225c24d51/src/compiler.jl#L6534 here for example).

Of course, we can still take advantage of new features in later Julia versions for better error backtraces etc., but it's not a requirement for Enzyme to work.

In contrast, I think this was a much bigger issue for Diffractor.jl, which required a near-bleeding-edge Julia for the infrastructure needed to build it.

5 Likes

As someone working in this space: such a version doesn't exist and cannot exist, because

  1. There will be new ideas for how to do AD better that will require support in the compiler to make possible.
  2. The language will add a new feature that will need support from the AD engine.

Now of course that doesn't mean you can't freeze both your AD and your Julia version and have something that keeps on working as well as it ever did. But that's also true today, as of any stable Julia release, give or take a few months.
You don't even need the LTS for that.

This is true.

4 Likes

If there are bugs in the compiler of the LTS release, I'm surprised the bugfixes are not backported. Do they require new features?

On the main topic, I agree with @oxinabox, I don't think we can envision that things are going to slow down in the near future. I suspect the next LTS won't have static compilation, won't have everything the AD engines need, won't have finished moving the stdlibs out into upgradable packages, will have lots of error messages and documentation that could be improved, etc. In other words, there will continue to be big, exciting and/or important changes made for as long as one can foresee. Why would you deny users the benefits of timely improvements?

@aplavin, I have become more convinced of your point, not through my own experience but from the cases where you've given a concrete, reproducible example. I'd strongly encourage you to keep better track of these examples when you encounter them, because documenting these failures and reporting them as issues to the package maintainers is one of the few routes towards fixing this unfortunate situation.

I'm also excited about Add public keyword by LilithHafner · Pull Request #50105 · JuliaLang/julia · GitHub (as noted by @ufechner7); I think that, plus a good Aqua test for packages once that merges, might help a lot.
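For context, a minimal sketch of what the proposed `public` keyword would look like; the module and names here are hypothetical, and the syntax is taken from that PR, so it requires a Julia version where the keyword has landed:

```julia
module MyPkg

export solve       # exported: `using MyPkg` brings `solve` into scope
public tolerance   # public API, but accessed qualified as MyPkg.tolerance

const tolerance = 1e-8
solve(x) = abs(x) < tolerance ? zero(x) : x

end # module
```

Anything marked neither `export` nor `public` remains internal, which is exactly the distinction a tool like Aqua.jl could then check mechanically when a package reaches into another package's names.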

19 Likes

Backporting requires manual intervention (sometimes, a lot of it); we basically have one amazing but overstretched backport fairy (Kristoffer Carlson) for the entire language. At this point, it's pretty selective what gets backported to 1.6.

7 Likes

For sake of understanding, what are you thinking of by "won't have everything the AD engines need"?

While I can definitely attest to the benefits of new features in later Julia versions making AD easier, Diffractor is the only one I can think of that requires bleeding-edge features?

2 Likes

Presumably Tim meant "want" rather than "need", i.e. people developing AD engines will continue to run into things in Julia's runtime that are either inconvenient or block certain features for them, and they will want upstream changes to Julia's internals.

Sorry, I was indeed talking about Diffractor. My impression is that Enzyme is very standalone and works beautifully with current Julia versions.

1 Like

I personally don't see how making fewer releases will solve anything in isolation. That just means that each release will be bigger and increase the likelihood of issues sneaking through the quality checks.

If anything, more frequent, smaller releases would probably be better at reducing the level of accidental breakage of packages.

So the focus should probably be on what actually ends up getting merged to master, not the release cadence.

19 Likes

I don't think so. And I would want to go the other way and have more frequent releases.

I don't see much of an ecosystem quality problem relating to new Julia versions (I think others agree here, except for a minor update problem, and I proposed how to fix that):

I obviously don't control the frequency of releases, but if each one aims to be a smaller upgrade (maybe not trying to be very small), then releases can be more frequent, and missing a feature freeze becomes less of a worry. People might argue it's too much work, but we have had up to 6 point releases per minor version, and with more 1.x releases I think we would need fewer, if any, 1.x.y releases, so less backporting work.

[Julia is the first language I've been involved in contributing to. I'm not sure: does Julia have more regressions than other languages? I have no way to back that up; it could actually be fewer. Do people think there are too many, or more than expected? And I'm thinking of genuine regressions, ones fixed (or needing fixes) in point releases, not stuff that a package upgrade fixes.]

I wonder if there's anything actually actionable here, for example from the PoV of package developers. How can they fix the breakage after the fact?