Speculations about the default environment (or a new draft environment)

Hi.
I want using Julia to stay simple, without having to do anything at all about environments - unless I need to, in order to get some package combo to work. Then I want one way that can deal with all the exceptions (which I've never encountered - in years). Lol, me me me!
I'm fine with a bloated default setup and some startup lag if that means my path to the result is simple. For me, complexity costs big time. I just want to get stuff done. The current default setup gives me that.
Please, Julia team, keep it simple. It's one big reason I switched to Julia.

3 Likes

Count me too.

1 Like

I agree, but mainly because I am spoiled by R and its CRAN ecosystem, where bloated library installations have never resulted in dependency issues for me. Man, are you in for a rude awakening when you transition to Python (pip, conda, venv, it makes no difference; after 6 months, attempting an env update is "suicide"). :slight_smile:
Tbh I must say that so far Julia has pleasantly surprised me by landing somewhere in the middle of these two experiences, but I do like the direction this post is going …

Pkg already does all of those things.

Again, all of those things are already implemented in Pkg.

13 Likes

I don't know how Nix works, but can we, in Julia, have two packages in the same environment which depend on two different (breaking) versions of the same third package?

1 Like

Some people here are implicitly questioning if this is really a thing because they’ve had no problems running a single environment. Here’s one factor that quickly becomes a ticket to dependency hell in a single environment: pinned packages (or upper version bounds in general). Either your own pins, or perhaps you’ve installed a package that pins some of its dependencies.

Pins should probably be viewed as a quick but temporary fix, used when some dependency has introduced a bug or modified its API more than you can deal with right now. But if you leave a package pinned for an extended time, conflicts will arise as the ecosystem moves on. And the problems will disproportionately affect Julia newbies, who tend to have everything in a single environment. I only very recently realized this and am myself very much guilty of unnecessary long-term pinning, but I am now working to remove the pins in my packages. So please do not pin dependencies for an extended time in your public packages.
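
As a rough sketch of that workflow in the Pkg REPL (Example is a placeholder package name, and the version is made up):

] pin Example@0.5.3    # temporary fix: hold Example at a known-good version while upstream sorts out a bug
] free Example         # remove the pin as soon as the issue is resolved, before the rest of the ecosystem moves on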

2 Likes

The issue arises in the same manner if one of the packages just does not update the [compat] entries regularly.
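
For context, those entries live in a package's Project.toml and look roughly like the sketch below (the package names are just examples and the bounds are made up). If a package stops refreshing these bounds, the resolver is eventually forced to hold other packages back to satisfy them.

[compat]
julia = "1.6"
StaticArrays = "1.5"       # accept any 1.x release from 1.5 onwards
DataFrames = "0.22, 1"     # accept the old 0.22 series or any 1.x release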

I think it is clear that interoperability of the interfaces, and of the ecosystem in general, depends on the compatibility of the package versions that different packages share as dependencies. This interoperability is the well-known feature of Julia that makes the situation a bit more complicated than in other languages, where "types" do not permeate into the inner workings of everything.

I believe Julia does not allow this, nor should it.

While it may be tempting to allow this kind of compatibility, it has the same pitfall as using include multiple times: you end up with multiple versions of the same struct, which can be incompatible with each other (i.e., the struct from one version may not have the same fields as the one from another version).

If packages A and B each depend on a different version of package C, and code in A returns a C.OneStruct which is then passed to a function defined in B, what the hell should happen? Should B start using the same version of C that A is using? Should B error because the struct is not really the same? Should B run the code in its own version of C as if the struct from the other version of C were compatible?
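
To make the pitfall concrete, here is a minimal sketch, with two hypothetical modules standing in for two breaking versions of C loaded side by side:

module C_v1                     # stand-in for C v1.x, the version A was written against
    struct OneStruct
        data::Vector{Float64}
    end
end

module C_v2                     # stand-in for C v2.x, the version B depends on
    struct OneStruct
        data::Vector{Float64}
        offset::Int             # new field introduced by the breaking release
    end
end

# A function "in B", written against the v2 layout:
total(s::C_v2.OneStruct) = sum(s.data) + s.offset

s_from_A = C_v1.OneStruct([1.0, 2.0])
total(s_from_A)                 # MethodError: the v1 struct is a different, incompatible type

Even if the method were left untyped, the v1 struct simply lacks the offset field that the v2 code expects.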

3 Likes

Exactly, that is what I tried to convey in the comment above yours. That makes the compatibility problems somewhat more complicated in Julia than in other languages, though. But it is a consequence of the great interoperability of the language.

1 Like

I think Julia users and Nix users mean different things by this.

] add StaticArrays@v1.4.3 StaticArrays@v1.4.2
ERROR: it is invalid to specify multiple packages with the same name: `StaticArrays`

] add uuid=90137ffa-7385-5640-81b9-e52037218182@v1.4.3 uuid=90137ffa-7385-5640-81b9-e52037218182@v1.4.2
ERROR: it is invalid to specify multiple packages with the same name: `uuid [90137ffa]`

] add StaticArrays@1.4.3
ERROR: Refusing to add package `StaticArrays [90137ffa]`.
Package `uuid=90137ffa-7385-5640-81b9-e52037218182` with the same UUID already exists as a direct dependency.
To remove the existing package, use `import Pkg; Pkg.rm("uuid")`.

If package A depends on C and package B depends on D, and code in A returns a C.OneStruct which is passed to a function defined in B, what should happen? I don't think this is anything special. Making this not work whenever C and D happen to have the same UUID seems like an avoidable limitation.
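
A small sketch of why the distinct-package case is unremarkable (hypothetical modules standing in for C and B; B never needs to know about C):

module C
    struct OneStruct
        x::Float64
    end
end

module B
    # generic code: the only assumption is that the argument has a field `x`
    double_x(s) = 2 * s.x
end

B.double_x(C.OneStruct(1.5))    # 3.0, dispatch only cares about the concrete type at the call site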

2 Likes

Currently, AFAIU, B should explicitly depend on C if it expects a struct from C with some particular property. If that is the case, the current behavior is safer, although limiting when the struct does not actually change between breaking releases of C.

In this sense, I think the question is more debatable when packages expose their user interface through Base data structures (floats, arrays). Then it would be practical for two versions of the same dependency to coexist. One could differentiate the dependencies of the interface from those of the inner workings of the package. In the current state of things, we depend on package maintainers keeping their [compat] entries up to date so that packages remain usable together. I'm not sure how well that will age in the very long term.

You're saying that in Nix you can have multiple versions of the same library in a single "environment" (or whatever that is called in Nix)? If so, that seems like a recipe for disaster down the line.

1 Like

Nix per se doesn’t have a concept of two versions of the same package; it considers them different packages. If I try to create an environment in which two files from different dependencies would collide on the same path, it will give an error. Otherwise, at the highest level it doesn’t impose many assumptions about how programs want their dependencies arranged.

Language-specific tools can customize whatever rules are appropriate for their module systems etc – for example, the Nix Python tools might disallow two packages using the same name because Python doesn’t support that.

(It’s been a little while since I’ve been deep into packaging, so I might miss something. Anyway I don’t want to distract from the main point of the thread.)

There are all shades of grey here. For instance, a package can drop dependencies by duplicating the relevant code inside itself. For simpler functions that's done all the time (norms, dot products, etc.). It is a disaster if the package is exposing the interface of the dependencies; otherwise it can be a good thing in many situations.

(The long life of those monolithic Fortran codes has a reason: even if they become impossible to maintain, they continue to be useful.)

They will not have just the same UUID; in general they will have the same module name and the same exported function and struct names. Even if you manage to import the modules under different names to disambiguate, this does not change the fact that the structs provided by each version may be incompatible with each other, and it may be hard to detect this and turn it into an error. Some problems:

  1. Such an incompatibility may not even have anything to do with the fields of the struct; that was just the most straightforward example. The most pernicious example is having the same fields but code that now makes different assumptions about them, initializing or using them in different ways. So you either get silently broken code, or you cannot use together two libraries that were supposed to be able to exchange data in the format defined by a third package. Because of glue code, there can even be implicit data exchange between them that you are not aware of.
  2. I am not aware of an automated way of making this fail so as to avoid such subtle bugs. Methods in package C (or D) may not restrict their arguments to specific types, yet still expect them to be of the specific type defined in their own version of the package. How would you automatically pass through all methods of package C's code (during load) and add type annotations to the right untyped parameters, so that the code breaks as intended when the other version's data structures are passed in?

1 Like

This I don’t understand. You’ll never get two files to collide on the same path in Julia. Does it mean Nix uses a single global prefix? I was under the impression they’d use split prefixes, somewhat similarly to how libraries in Julia artifacts work.

2 Likes

I’m afraid I’m making things more confusing than clear, so I’d better save this for another time when I have the models at top of mind and examples available. Sorry for the tangent.

Hi,
I agree with you. I'm an "undisciplined" (always) / new (only to Julia) user. I was expecting Julia environments to be hierarchical and hereditary → in each successive project subdirectory under "src" I could add (= append) a different package, with the only restriction being hierarchical compatibility with the dependencies in the previous "Manifest.toml". I "hack" this now by copying the last pair of ".toml" files from the previous src subdirectory along the hierarchical path.
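For what it's worth, Julia's environment stacking already gives part of this: environments later in LOAD_PATH (typically the shared default environment) stay visible while a nested project is active, even though version resolution is still done per environment rather than inherited. A rough sketch, with a hypothetical subdirectory and package name:

julia> LOAD_PATH                  # the environment stack, by default ["@", "@v#.#", "@stdlib"]
] activate ./subproject           # make a nested project the primary (active) environment
julia> using SomeSharedPkg        # installed only in the default environment, but it still loads
                                  # because the default environment sits later in the stack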
I have spent the last month studying "environments, projects, packages, applications", perhaps because the only "disciplined and experienced" use of the current environments seems to be to "push" every package into a single "confusing" bag.
Sincerely.