[Article] Dear Google Cloud: Your Deprecation Policy is Killing You

This article is a work of art:

I never really thought much about this topic, but this article piqued my curiosity. Are we (the Julia community) doing enough thinking about backwards compatibility? While passively observing discussions about semver, I get the impression that we are actually OK with breaking code written for v1.0 when v2.0 is released (which is pretty much the definition of semver), but is that really OK?

I am not entirely sure how Compat.jl works, but it looks like it is meant to let packages support older Julia versions, and it requires changing your code, which doesn’t seem ideal. I don’t think there is currently any expectation that code written for v1.0 will work with v2.0.

When v2.0 is released, is there some way to guarantee that code written for v1.0 will still work? Should there be? Could there be some kind of internal compat to support older code?

I wouldn’t be surprised if this conversation is already happening somewhere, so references are welcome :slight_smile:


There is always the Python 2→3 saga to remind us, and it is more directly relevant. There was also a recent PSA about Julia 2.0, I think.


I get the impression that breakages will happen only if there is no other way and the benefit justifies the cost. Cf.


Julia is set up a lot better than Python in these respects, for two major reasons.

One is immutability. Especially with the Pkg server, the Julia v1.0 world has a large amount of immutability. What happens in the Python world is that packages tend to roam pretty freely, so without constantly keeping up to date you won’t have a working installation. Pip is terrifying. And it can be hard to recreate an environment from 3 years ago.

While upper bounds cause their own problems, one thing that is clear is that there are so many upper bounds in the Julia ecosystem that old things won’t move. In DifferentialEquations.jl, Julia v1.0 will still get the same installation that has been there for a few years now. It won’t have the fancy new neural PDE stuff, but hey, the ODE/SDE/etc. software is still there. When Turing.jl decides to cut compatibility, that doesn’t mean it will go away; it just means there will be an immutable version that is there for people to rely on essentially indefinitely, unchanged.

In those regards, being able to recreate past environments is really strong in Julia, which allows developers of packages to keep moving forward even if users don’t want to, and that separation isn’t bad, it’s quite good! What the article is pointing out is that Google explicitly does not allow that, partially because infrastructure is very different from package ecosystems, but also because of a philosophical difference. The core developers of Pkg have been very clear about this immutability, and it is something that many groups rely on. So upgrade Julia and package versions at your own pace: that’s how it’s designed, and it’s okay!
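For concreteness, those upper bounds live in each package’s Project.toml, in the `[compat]` section; a minimal sketch (the dependency name and versions here are made up for illustration):

```toml
[compat]
julia = "1"
SomeDep = "0.21"  # caret-style bound: for 0.x packages a minor bump is treated as
                  # breaking, so the resolver will never pull SomeDep 0.22 or later
```

This is how an old release of a package keeps resolving to the same working set of dependencies years later.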

But secondly, because the pre-1.0 times were hectic in a universe where the competitors were well-established, Julia had to build a lot of infrastructure to help developers upgrade: FemtoCleaner, Compat.jl, @deprecate, etc. I’d estimate that 90% of major version upgrades are automated. Are there still issues that can come up? Yes, they always can and they always will. But (a) immutability means that if you need something working today, you can use the old version and that’s okay, and (b) the vast majority of the changes are handled by automated or semi-automated tooling provided by the core developers, and that tooling is extensible so that package developers get similar tools for their own users. One of the main issues in the Python 2 → Python 3 change was the lack of such tooling: upgrading was considered the developer’s problem, not the language authors’ problem. I think everyone on the core team feels it is part of their duty to ensure that updating versions is as smooth as possible, even during breaking changes like v1.0, and that’s a major philosophical difference (though to finalize the Python 3 change, the philosophy in the Python world had to change too!).

The moral of the story is that there’s still high velocity in the Julia world, but I feel pretty comfortable. The other day I picked up someone’s blog post from 2017 or 2018, took their ODE model, and it created the same plot. Things move fast, and DiffEq/SciML may be moving among the fastest, but model code still works.


In my experience the Julia ecosystem prefers moving forward and breaking things over keeping strict backwards compatibility. The effort of adapting packages to new and breaking language features is delegated to package maintainers. Compat and other tools help, but not 100%. A recent example for me was that the (standard) random number generator is version-specific, so if you run your code on 1.5 vs. 1.4 you get different numbers, which can hurt a lot in testing.

(And for sure someone on the thread will comment that the situation is worse in language X, which is not actually helpful in solving problems in Julia.)


Use https://github.com/rfourquet/StableRNGs.jl
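A minimal sketch of what that looks like in a test suite (assuming the StableRNGs.jl package is installed):

```julia
using StableRNGs  # third-party package providing an RNG that is stable across Julia releases

rng = StableRNG(123)   # explicitly seeded stable generator
x = rand(rng, 3)       # draws do not change between Julia 1.4, 1.5, ...

# Re-seeding reproduces exactly the same stream, so tests can pin the seed
# instead of pinning the Julia version:
@assert rand(StableRNG(123), 3) == x
```

The point is that tests with hardcoded random draws should depend on an explicit, stable RNG rather than on the default generator, whose stream is allowed to change between minor Julia releases.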




I hear you guys. I still think it should be technically possible to release v2.0 without breaking v1.0 code, though, even if it means some redundancy (like the Android example).

You can see Android’s commitment to backwards compatibility in their APIs. It’s a sure sign, when there are four or five different coexisting subsystems for doing literally the same thing, that underlying it all is a commitment to backwards compatibility. Which in the Platforms world, is synonymous with commitment to your customers, and to your marketplace.

It would definitely be harder, but it seems like it would be worth it in the long run. I think our habit is to deprecate something in one release and then break it in the next. Couldn’t it just stay deprecated and supported forever (like the Emacs example)?

In the Emacs world (and in many other domains, some of which we’ll explore below), when they make an API obsolete, they are basically saying: “You really shouldn’t use this approach, because even though it works, it suffers from various deficiencies which we enumerate here. But in the end it’s your call.”

Note, I’m talking about Julia itself. Not just packages.

[Edit: I think if any community can crack this, the amazing Julia maintainers could.]

Think of it like this: do you volunteer to do it?


Possibly — but it would distract them from providing amazing new improvements to the language.

Recognizing that maintainers are a finite, scarce resource, personally I would prefer if they could keep focusing on language improvements instead of supporting deprecations forever. That’s a long time.


No. So what?


Sure. And that would be a totally reasonable thing to do. I just keep thinking about the word “greedy” and what that means and could it be extended to a greedy commitment to backward compatibility with all the long-term benefits and goodwill that comes with that.


Note that this issue stems from a misunderstanding of the Random API; StableRNG is still stable so far.


Think back to the v1.0 release, when we went from v0.6 to v0.7. The v0.7 release was basically the v1.0 release, but with deprecation warnings to give people a chance to update their packages; v1.0 was the same as v0.7 except those deprecations became errors. We would never introduce a breaking change without deprecating it first. So why can’t we just leave things deprecated (rather than removing deprecated methods) for backwards compatibility? For v0.6 to v1.0 it made sense, but maybe we could do something different for v1.0 to v2.0.
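For what it’s worth, the mechanism itself is tiny; a hedged sketch of what “leave it deprecated forever” could look like, using Base’s own deprecation macro (the function names here are made up):

```julia
# The new API
area(r) = π * r^2

# The old API kept alive: calls forward to the new method and emit a
# deprecation warning (visible when Julia is started with --depwarn=yes),
# rather than the method being deleted in the next breaking release.
Base.@deprecate old_area(r) area(r)

old_area(2.0)  # still works, returning the same value as area(2.0)
```

The question in this thread is essentially whether such definitions should stay in Base across a 2.0 boundary instead of being removed once the deprecation period ends.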


I’d like to emphasise that there is also a crucial difference between an online service and offline software. I cannot roll back the Google Cloud API to an old version as an end user, but I can easily recreate an environment with Julia 1.0 and all its dependencies.

That of course does not solve the problem that I may still have to rewrite code in order to run under Julia 2.0, but at least I can run it. I think that nowadays, with Docker and other containerisation solutions, we are in a much better position than in the past, when you had either nothing or slow VM solutions.
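Concretely, recreating such an old environment is just a couple of Pkg calls, given a project directory that still has its Project.toml and Manifest.toml (the directory name here is made up):

```julia
using Pkg

# Activate an old project: a directory containing Project.toml + Manifest.toml
Pkg.activate("my_2018_project")

# Install exactly the package versions recorded in the Manifest,
# reproducing the environment as it was when the Manifest was written
Pkg.instantiate()
```

Combined with running the matching old Julia binary (or a container image of it), this is what makes the “roll back to the old world” option realistic in a way that a hosted API never is.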


You just put an upper bound on the ostensible benefits — they are not worth your time.

Why would they be worth someone else’s, if you are one of the people who cares about this issue?


Of course I will help if I can. I didn’t find that comment particularly helpful (and actually flagged it), and I should have just ignored it.

It is with good intentions I bring this up. I think v2.0 is still far enough away that it makes sense to have this conversation. Like I said, knowing the Julia devs, I would be surprised if it isn’t discussed somewhere already. I never really thought about it and went along with the semver mantra of “deprecate, then break”, but the article had me wondering if we can do better for v2.0. That is all.


Leaving things deprecated without ever erroring out would at some point become very annoying and frustrating. Package maintainers could postpone needed updates to their packages, because everything would still work. Maybe deprecation warnings can be switched off? Even worse: there would be no need to update a package at all.
Now imagine a new user who just has to use some number of packages: oh my god, what is this? The terminal is full of warnings? Not very trust-inspiring.

A second point: I am sure that keeping everything backwards compatible would be quite costly in terms of performance and memory usage. It’s just not optimal.

On the other side, making everything optimal as soon as it appears would be very breaking. That’s not a good idea either; it would limit or stop the creativity of users and developers.

So the conclusion is clear: find a reasonable compromise for how long to keep things stable and when to break old stuff.

Now, are the Julia developers getting this compromise right for me? It doesn’t matter, because I still have some production code running on Julia 0.3, and it just works perfectly, self-contained.


Very related: Thoughts on eventual Julia 2.0 transition


Maybe I am misunderstanding something here, but what you propose for 2.0 sounds to me like it could just be called a 1.x release. Deprecations aren’t possible for every feature, especially for changes to the parser or bigger language features. The point of a 2.0 release would be to make some breaking changes that perhaps were missed in 1.0 and that can’t be introduced in non-breaking ways. That of course doesn’t mean that 2.0 will just break stuff at random; I think the bar will be pretty high for breaking changes. It also looks like 2.0 is still a few years off, since 1.x currently serves us quite well, but there are some smallish breaking changes I’d really like to eventually see in 2.0.