@aplavin Thanks for this tip. This is how I have started organizing my production and development projects. However, all custom packages from completely unrelated projects are still dumped into the ~/.julia folder, no matter where I create the Project and Manifest TOML files for the “production” app. Based on what other experts on this thread have said, Julia’s gc is the way to get rid of unused packages from the ~/.julia depot, but there is currently no way to keep packages from multiple projects separate. The fact that gc automatically deletes unused packages from ~/.julia is still unsettling, since it takes away the end-user’s ability to control what gets removed and when.

PS: I don’t think this is better in any way than Python if you have used ‘poetry’ to manage your packages. If you have only used pip for package management, then I agree with you, but with poetry, package management in Python is as good as, if not better than, what we have here, since separate projects can stay separate and can easily be deleted on demand by the end-user, which is clearly not the case with Julia.

@StefanKarpinski Thanks for pointing that out.

What’s the problem with this? Why is it unsettling?

PS: I don’t think this is better in any way than Python if you have used ‘poetry’ to manage your packages. If you have only used pip for package management, then I agree with you, but with poetry, package management in Python is as good as, if not better than, what we have here, since separate projects can stay separate and can easily be deleted on demand by the end-user, which is clearly not the case with Julia.

Why is this useful?


For instance, if I git checkout an old version of my code, I may want its dependencies to still be on my computer.

The tools I use that have automatic gc have a setting for when it should be run. One (Nix) also lets me pick some files as “gc roots” that shouldn’t be collected regardless of other conditions.




I don’t see why you would want your computer to store all the dependencies of the hundreds or thousands of different projects you will work on. That immediately leads to a bloated ~/.julia folder and, in a few years, creates confusion because of the accumulation of different packages and versions. IMHO, a better solution would be: when you git checkout your project, you have the ability to build those dependencies locally, so you won’t have any issue with the reproducibility of your project. In Python (not trying to advertise Python, simply looking for similar features in Julia that I have found extremely handy), one can do ‘poetry install’ to install all the dependencies of the project, defined in the pyproject.toml and poetry.lock files, which are intrinsic parts of the project itself (just like in Julia) that you checked out from GitHub. This builds those dependencies locally under the project folder, so you can run the project standalone (since it now has the dependencies it needs) many years from now without any issue. After you are done working on this project, either remove the project folder or simply remove the folder which stores all the dependencies. Note that the dependencies are built inside the project folder, so your computer is not polluted by the dependencies of the projects you have worked on.

Wondering if we can have a similar feature like this in Julia.

I don’t see why a feature like this would not be useful. Having the freedom to separate (or combine, if we prefer) the dependencies of various projects creates less confusion and better management and compartmentalization of projects, especially when you are working on completely unrelated projects.

For one thing, I don’t have internet access all the time. If I need to download something Pkg.jl has removed, I may not be able to proceed.

After you are done working on this project, either remove the project folder or simply remove the folder which stores all the dependencies.

That suggests a new Pkg command (or an option to ]gc) for when I’m done working on a project and want to remove its installed dependencies from ~/.julia without touching the Manifest.toml.

I understand that this doesn’t work the way you’re accustomed to, but please do be open to how it works in Julia because it works very well once you get used to it. First, a few observations about Python package installation:

  • Python package installation is highly stateful: installation requires running arbitrary Python code, which can do anything it wants and isn’t necessarily idempotent or repeatable unless the package authors are quite careful to ensure that it is. Installing the same package version at different times or in different environments can produce very different results.

  • In Python even what a specific version of a package depends on can change based on the results of running code. That’s one of the reasons that it’s hard to resolve Python dependencies: you don’t know what a package version even depends on until you’ve tried installing it.

This means that if you want to keep a project working, it really needs its own local, unshared set of dependencies that are siloed from all other projects. By comparison, here are some features of how Julia packages management works:

  • Julia package installation is stateless and reproducible. Installing a package doesn’t involve running any package code: a tarball of the source tree of a package version is unpacked in the right path in $depot/packages and that’s it. The file tree is installed read-only and never modified by the package manager, the package or anything else.

  • Moreover, Julia package versions are identified by their git SHA1 tree hash, so they’re inherently verifiable: if you compute the tree hash and it doesn’t match, you have the wrong package source.

  • (There is a legacy exception to the rule that no package code is run during installation: if a package contains a file called deps/build.jl it will be run after installation to allow the package to build things, for example, downloading some data or compiling some source code. This is, however, only allowed to modify the contents of the deps directory. Also, using deps/build.jl is no longer recommended: there are better—stateless and reproducible—mechanisms for doing this kind of thing these days, as described in the next three points.)

  • If a package needs to download and use immutable data, it should use artifacts, which are automatically downloaded and installed when a package declares that they are needed by including an Artifacts.toml file. Like packages, they are immutable and content addressed, identified and loaded by git tree hash, making them inherently verifiable and perfectly cacheable and reproducible. Since artifacts are immutable, it is safe to share them between multiple package versions and even between different packages. This can save considerable space since artifacts can be large. Artifacts are automatically cleaned up by pkg> gc when there are no more package versions that depend on them.

  • The artifact mechanism is also used to install binary dependencies such as pre-built C and Fortran libraries, a large body of which can be found in Yggdrasil. This means that not only is Julia package installation stateless and reproducible, but binary dependency installation is as well. This makes it very reliable to install and set up a Julia project, from the source code to the binary dependencies that it uses.

  • If a package needs to work with mutable data, rather than using the package directory for this, it should use scratch spaces, which let packages set up transient scratch directories in which they can download or generate whatever they want and have it persist across package usages. Moreover, a scratch space can be shared by different versions of the same package, instead of each version generating its own copy, and scratch spaces are automatically cleaned up by pkg> gc when there are no more versions of the package that need them.

  • Julia packages and artifacts are served to Julia clients by a global network of package servers which can be reached at https://pkg.julialang.org. They are served by URLs that look like

    • https://pkg.julialang.org/package/$uuid/$hash
    • https://pkg.julialang.org/artifact/$hash

    When you download one of these you get a gzip-compressed tarball of the package version or artifact tree. For example, you can download and list the contents of version 1.2.0 of the BSDiff package like this:

    julia> using Tar
    julia> Tar.list(pipeline(`curl -fLsS https://pkg.julialang.org/package/7b188ff4-8bb6-4dee-bbe1-9b6fdde2c7c5/70d0d8a17dcd4dbf44c29e849550bc6bf53c6ec8`, `gzcat`))
    11-element Vector{Tar.Header}:
     Tar.Header(".github/workflows/TagBot.yml", :file, 0o644, 204, "")
     Tar.Header(".gitignore", :file, 0o644, 31, "")
     Tar.Header(".travis.yml", :file, 0o644, 121, "")
     Tar.Header("Artifacts.toml", :file, 0o644, 281, "")
     Tar.Header("LICENSE", :file, 0o644, 2531, "")
     Tar.Header("Project.toml", :file, 0o644, 909, "")
     Tar.Header("README.md", :file, 0o644, 5577, "")
     Tar.Header("src/BSDiff.jl", :file, 0o644, 12396, "")
     Tar.Header("src/classic.jl", :file, 0o644, 3394, "")
     Tar.Header("src/endsley.jl", :file, 0o644, 2405, "")
     Tar.Header("test/runtests.jl", :file, 0o644, 7930, "")

    Here 7b188ff4-8bb6-4dee-bbe1-9b6fdde2c7c5 is the UUID of the package and 70d0d8a17dcd4dbf44c29e849550bc6bf53c6ec8 is the tree hash of the 1.2.0 version recorded in the General registry. Similarly, you can download the test_data artifact that BSDiff uses for testing like this:

    julia> Tar.list(pipeline(`curl -fLsS https://pkg.julialang.org/artifact/d2ca0cfa36769774a442b467b353dcf908186384`, `gzcat`))
    6-element Vector{Tar.Header}:
     Tar.Header(".gitignore", :file, 0o644, 39, "")
     Tar.Header("LICENSE", :file, 0o644, 1100, "")
     Tar.Header("registry/after.tar", :file, 0o644, 13685760, "")
     Tar.Header("registry/before.tar", :file, 0o644, 13349376, "")
     Tar.Header("registry/classic.diff", :file, 0o644, 13792104, "")
     Tar.Header("registry/reference.diff", :file, 0o644, 13792152, "")
  • Because of immutability and content addressing, these tarball URLs are perfectly cacheable: cache invalidation is never required since the content of a package version or artifact cannot ever change. Moreover, the package server system stores them permanently in multiple storage locations, including S3 buckets belonging to the JuliaLang AWS account, which means that if you’ve installed something via package servers in the past, you’ll be able to do so in the future as well (forever).
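
To make the artifact and scratch-space points above concrete, here is a rough sketch of how a package might use them. The artifact and scratch names are illustrative, and the artifact string macro only resolves if the package ships an Artifacts.toml with a matching entry:

```julia
# Inside a package that declares "test_data" in its Artifacts.toml:
using Artifacts                     # stdlib; provides the artifact"..." macro
data_dir = artifact"test_data"      # path to the immutable, content-addressed tree,
                                    # downloaded on demand and shared across packages

# For mutable state, use a scratch space instead of the package directory:
using Scratch                       # the Scratch.jl package
cache_dir = @get_scratch!("downloads")  # per-package directory, created on first use,
                                        # cleaned up by pkg> gc when no version needs it
```

Both directories live in the shared depot, so any project that needs the same content reuses it rather than downloading its own copy.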

All of this means that Julia package management is very different from Python package management. In Python, if you want any hope of reproducibility or even keeping a project working in the future, you must have a local set of packages for it that are siloed from the dependencies of any other projects. Otherwise there’s a very real risk that a package operation in one project will trash an unrelated project. If you delete or modify the packages that a project depends on, you may not be able to get them back into the same (working) state they were in when it was working.

In Julia, on the other hand, your installed packages and artifacts are essentially just a cache. By design, all the information you need to reconstitute everything a project depends on is recorded in its Project.toml and Manifest.toml files. You can delete all of the packages and artifacts that a project depends on and just do pkg> instantiate and it will download and install everything that’s needed, and since the package servers remember anything you’ve installed forever, you can do this at any point. This works so reliably that people regularly just delete their ~/.julia depots and reconstitute them from scratch.
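
As a minimal sketch (the path is illustrative), reconstituting a project from just its two TOML files looks like:

```julia
using Pkg
Pkg.activate("path/to/MyProject")  # directory containing Project.toml and Manifest.toml
Pkg.instantiate()                  # re-downloads exactly the versions recorded in the manifest
```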

So having local, siloed sets of packages and artifacts for each project in Julia just isn’t necessary or useful the way it is in Python. You can, nevertheless, have a local depot path if you want by setting the JULIA_DEPOT_PATH variable to the project directory like this:

$ mkdir MyProject
$ cd MyProject
$ export JULIA_DEPOT_PATH=$(pwd)
$ julia -q --project=.
(MyProject) pkg> add BSDiff
  Installing known registries into `~/tmp/MyProject`
    Updating registry at `~/tmp/MyProject/registries/General.toml`
   Resolving package versions...
   Installed Preferences ──────── v1.2.3
   Installed CodecBzip2 ───────── v0.7.2
   Installed Bzip2_jll ────────── v1.0.8+0
   Installed SuffixArrays ─────── v0.3.0
   Installed BSDiff ───────────── v1.2.0
   Installed BufferedStreams ──── v1.0.0
   Installed Compat ───────────── v3.41.0
   Installed TranscodingStreams ─ v0.9.6
   Installed JLLWrappers ──────── v1.3.0
  Downloaded artifact: Bzip2
    Updating `~/tmp/MyProject/Project.toml`
  [7b188ff4] + BSDiff v1.2.0
    Updating `~/tmp/MyProject/Manifest.toml`
  [7b188ff4] + BSDiff v1.2.0
  [e1450e63] + BufferedStreams v1.0.0
  [523fee87] + CodecBzip2 v0.7.2
  [34da2185] + Compat v3.41.0
  [692b3bcd] + JLLWrappers v1.3.0
  [21216c6a] + Preferences v1.2.3
  [24f65c1e] + SuffixArrays v0.3.0
  [3bb67fe8] + TranscodingStreams v0.9.6
  [6e34b625] + Bzip2_jll v1.0.8+0
  [0dad84c5] + ArgTools v1.1.1
  [56f22d72] + Artifacts
  [2a0f44e3] + Base64
  [ade2ca70] + Dates
  [8bb1440f] + DelimitedFiles
  [8ba89e20] + Distributed
  [f43a241f] + Downloads v1.5.1
  [7b1f6079] + FileWatching
  [b77e0a4c] + InteractiveUtils
  [b27032c2] + LibCURL v0.6.3
  [76f85450] + LibGit2
  [8f399da3] + Libdl
  [37e2e46d] + LinearAlgebra
  [56ddb016] + Logging
  [d6f4376e] + Markdown
  [a63ad114] + Mmap
  [ca575930] + NetworkOptions v1.2.0
  [44cfe95a] + Pkg v1.8.0
  [de0858da] + Printf
  [3fa0cd96] + REPL
  [9a3f8284] + Random
  [ea8e919c] + SHA v0.7.0
  [9e88b42a] + Serialization
  [1a1011a3] + SharedArrays
  [6462fe0b] + Sockets
  [2f01184e] + SparseArrays
  [10745b16] + Statistics
  [fa267f1f] + TOML v1.0.0
  [a4e569a6] + Tar v1.10.0
  [8dfed614] + Test
  [cf7118a7] + UUIDs
  [4ec0a83e] + Unicode
  [e66e0078] + CompilerSupportLibraries_jll v0.5.0+0
  [deac9b47] + LibCURL_jll v7.73.0+4
  [29816b5a] + LibSSH2_jll v1.9.1+2
  [c8ffd9c3] + MbedTLS_jll v2.24.0+2
  [14a3606d] + MozillaCACerts_jll v2020.7.22
  [4536629a] + OpenBLAS_jll v0.3.17+2
  [83775a58] + Zlib_jll v1.2.12+1
  [8e850b90] + libblastrampoline_jll v3.1.0+0
  [8e850ede] + nghttp2_jll v1.41.0+1
  [3f19e933] + p7zip_jll v16.2.1+1
Precompiling project...
  15 dependencies successfully precompiled in 3 seconds
$ ls -l
total 12
-rw-r--r--  1 stefan staff 6458 Jan  1 16:15 Manifest.toml
-rw-r--r--  1 stefan staff   55 Jan  1 16:15 Project.toml
drwxr-xr-x  3 stefan staff   96 Jan  1 16:15 artifacts
drwxr-xr-x  3 stefan staff   96 Jan  1 16:15 compiled
drwxr-xr-x  5 stefan staff  160 Jan  1 16:15 logs
drwxr-xr-x 11 stefan staff  352 Jan  1 16:15 packages
drwxr-xr-x  4 stefan staff  128 Jan  1 16:14 registries
drwxr-xr-x  3 stefan staff   96 Jan  1 16:15 scratchspaces

As you can see, this installs project-local copies of packages, artifacts, etc. You might want to symlink the registries directory to ~/.julia/registries so that it’s shared, and the same goes for logs. But then again, why not just share all of the directories? There’s no danger in sharing packages or artifacts, since they’re immutable.
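
If you do want to share just some of the directories, a sketch of the symlink approach might look like this (run from inside MyProject, assuming the local copies can be discarded):

```julia
# Replace the project-local copies of these depot directories with
# symlinks to the shared ones in ~/.julia:
for dir in ("registries", "logs")
    rm(dir; recursive=true, force=true)               # drop the local copy
    symlink(joinpath(homedir(), ".julia", dir), dir)  # point at the shared one
end
```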

Regarding the concern that keeping packages and artifacts in a shared location will cause bloat, I’ve already explained how pkg> gc works—it prevents exactly the bloat you’re worried about. In fact, having a single place where packages and artifacts live reduces bloat since a single version can be shared by as many projects as need them. To clean up unused packages or artifacts, just do pkg> gc --all and they’ll be deleted. Or do nothing and let Julia do it automatically when you do package operations. As @jzr has mentioned, it would be good to have a Pkg command to forget about a manifest so that you don’t have to delete it before doing gc in order to clean up its dependencies, but you can actually already do this manually quite easily: manifests that you’ve used are recorded in ~/.julia/logs/manifest_usage.toml which looks like this:

[["…/Manifest.toml"]]
time = 2021-10-11T17:31:25.178Z

[["…/Manifest.toml"]]
time = 2021-09-09T16:45:36.021Z

[["…/Manifest.toml"]]
time = 2021-12-13T11:45:42.971Z

[["…/Manifest.toml"]]
time = 2021-09-09T16:45:36.122Z

[["…/Manifest.toml"]]
time = 2021-09-09T16:45:36.042Z

[["…/Manifest.toml"]]
time = 2022-01-01T16:42:18.669Z

[["…/Manifest.toml"]]
time = 2021-09-09T16:45:36.049Z

[["…/Manifest.toml"]]
time = 2021-10-12T20:48:13.562Z

[["…/Manifest.toml"]]
time = 2021-11-02T17:21:43.901Z

[["…/Manifest.toml"]]
time = 2021-11-29T11:03:09.906Z

You can open this in an editor, delete any entries you don’t care about keeping the dependencies installed for anymore, and then do gc again to clean up. It might be good to have a command to remove a manifest from the usage log, but editing the text file is already pretty easy—I do it periodically to purge dependencies of projects I don’t need to keep around anymore.
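
For those who prefer not to edit the file by hand, here is a hedged sketch of the same cleanup using the TOML standard library (the path prefix to purge is illustrative):

```julia
using TOML
# The usage log lives in the primary depot:
usage_file = joinpath(first(DEPOT_PATH), "logs", "manifest_usage.toml")
usage = TOML.parsefile(usage_file)
# Forget manifests of projects you no longer keep around:
filter!(((path, _),) -> !startswith(path, "/home/me/old-projects/"), usage)
open(io -> TOML.print(io, usage), usage_file, "w")
# Then run `pkg> gc` to collect their now-unreferenced dependencies.
```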

It would definitely be a nice feature to be able to install all of a project’s dependencies locally to the project, not for the sake of reproducibility or disk space (it doesn’t help with either of those in Julia), but for the sake of being able to ship someone a self-contained application bundle that can be used without downloading anything else. However, if that’s what you want, PackageCompiler does this for you and also compiles the application into a custom binary, which is why this functionality hasn’t been pressing to develop in Julia itself.


A first-order effect of storing dependencies in a single location and not duplicating them for each project (as Python does) is simply less disk-space usage. A second-order effect is arguably more important: it’s common to have lots of small, isolated “projects” in Julia. For example, each Pluto notebook is a separate Julia project by default, bringing all those convenience and reproducibility features. This wouldn’t really be feasible if all dependencies were stored locally to each project.


Yes, very much so. With this approach creating a completely isolated environment is very lightweight—all you need to do is make two text files. It’s also fairly common to make a new environment and not have to install any new package versions because they’re already present.


I’ll add that the way this works was heavily inspired by the way git works. Everything in git is immutable and content addressed and rather than mutating things in place, you create new immutable objects. On top of that there is a lightweight layer mapping changeable human-facing names like branches or version numbers to immutable objects. Creating or updating those human-facing names is very cheap. This corresponds to creating branches in git or creating environments in Pkg. People familiar with git will note that it also has a gc command that removes unreachable objects.


Thanks all for clarifying fine points. Much appreciated.

I did not want to stretch this any further (since I agree with most of what has been said here), but one point I wanted to clarify is that Python does allow you to have it both ways: if you want to reuse dependencies, you can simply put them on the PYTHONPATH (while building the dependencies, so it does not pollute your system at any point), and poetry (feel free to check it out) will not install/build those dependencies if the versions required by the project are satisfied by packages found on the PYTHONPATH. Just like Julia, poetry uses pyproject.toml and poetry.lock files (two text files), which is all you need to rebuild all the dependencies without any ambiguity, now or in the future, to run the project. Python does not really need a ‘gc’ command to manage unused packages, I guess for obvious reasons. [Again, I want to emphasize that I want to use Julia for my projects, hence this discussion. I am bringing Python into the discussion just for the sake of comparison, and to see how Julia handles its business so that I use it the way it was intended.]

What a great write-up Stefan! It should be “pinned” or something. A must-read!!

I didn’t know about manifest_usage.toml, curious to see what’s inside. :joy:


A few (genuine) questions:

  1. Does the PYTHONPATH mechanism allow for installing multiple shared versions of the same package to be used from different local environments or can you only have one version of a given package installed centrally at a time?

  2. What guarantees does Python/poetry provide for the availability of package source and binary dependency downloads in the future?

  3. What guarantees does Python/poetry provide regarding reinstallation of packages producing the same results in the future, given that Python package installation in general allows arbitrary, potentially non-repeatable code to execute during installation and to affect its outcome?

  4. Given that there is no package gc, if you choose to install shared packages centrally, how do you get rid of them when no project needs them anymore?


That’s not what that feature is for. PYTHONPATH is just JULIA_LOAD_PATH.

PyPI permits deletion, but it’s only supposed to be used in rare circumstances such as if you accidentally publish secrets into a package. If you want to remove a package version from PyPI e.g. if it’s broken, you can yank it which will cause installers to skip it in resolution unless pinned exactly with ==, so pinned users won’t change but unpinned users won’t get the broken version.

Not much. Packagers seeking reproducibility may need to patch their dependencies to ensure it.

Do all Julia General Registry packages build reproducibly on all platforms?

poetry env remove <python>

removes the environment for this project for the specified Python version.

(1) I get your point. It is not doable in a very elegant way, but I am not entirely sure it is a complete showstopper. One could probably get around it by installing packages in multiple depots under a central depot and selectively placing the inner depot with the appropriate versions on the PYTHONPATH, so that when poetry installs dependencies for a specific project, it won’t duplicate the ones it finds on the current PYTHONPATH. (Caveat: some additional hacks to import packages may be necessary. I haven’t fully explored this avenue, since for real-world projects I prefer keeping dependencies separate and deleting the dependencies of projects I no longer use.)

For the rest, I kind of agree with @jzr, although I’d be interested in a real-world example of the situation you described in (3).

Reproducibility is not required for a package, but in practice most binary dependencies are prebuilt in a reproducible manner, with a reproducible toolchain, thanks to BinaryBuilder.


Right. To summarize, PYTHONPATH notwithstanding, Python doesn’t really support the model where many different versions of packages and binary dependencies can be installed centrally and used from various local environments. Using PYTHONPATH is very much in the old “centralized installation of a single (hopefully) consistent set of versions of packages” school of package management. If you want to have isolated project environments, your only option in Python is to also install packages local to your projects.

This situation in Python is why users coming from Python who like poetry may expect and want to have project-local installation of packages and binary dependencies. However, extrapolating the problems with the old school centralized PYTHONPATH approach to Julia does not follow, since Julia is designed to have the benefits of project-local environments without having to actually install packages and artifacts local to each project.

This is the same as the Julia General registry. What I was wondering more specifically was about guarantees on how long the actual contents of packages will be serveable by the PyPI servers if, say, the upstream source for a package goes away. Do they do something similar where they save and persist registered package versions permanently?

Also, what about other things that a package may need to complete its installation? Packages often need to download data or binaries in order to function properly. Does Python do anything to ensure that will continue to be possible in the future? One of the major benefits of artifacts is that they are also served through the same package servers that serve package source, which means that they can be persisted by the same infrastructure, ensuring that Julia packages can be installed and used in the future, including data and binary dependencies. And, of course, that this is all immutable and cryptographically hashed is a bonus.

OK, that was my impression.

Do all Julia General Registry packages build reproducibly on all platforms?

Packages that don’t use the legacy deps/build.jl mechanism are reproducible and verifiable because all you do to install them is unpack a tarball in the right place and those tarballs can be checked for correctness by tree hashing. Pure Julia packages are generally pretty portable. Most binaries built with BinaryBuilder on Yggdrasil are portable as well and these are also reproducible because they are also just tarballs that need to be unpacked in the right place. Arranging for binary dependencies to work when being installed as immutable file trees was quite hard but has really paid great dividends.
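
That verifiability can be checked directly. With network access, Tar.jl can compute the git tree hash of a served tarball, reusing the BSDiff example from earlier in the thread (the hash below is the one recorded in the registry for v1.2.0):

```julia
using Tar
url = "https://pkg.julialang.org/package/7b188ff4-8bb6-4dee-bbe1-9b6fdde2c7c5/70d0d8a17dcd4dbf44c29e849550bc6bf53c6ec8"
# Tar.tree_hash computes the git SHA1 tree hash of the tarball's contents;
# it should equal the hash in the URL if the download is intact.
hash = Tar.tree_hash(pipeline(`curl -fLsS $url`, `gzcat`))
hash == "70d0d8a17dcd4dbf44c29e849550bc6bf53c6ec8"
```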

My question was about the central packages installed in PYTHONPATH, not poetry-managed environments. The claim was that this replicated Julia’s ability to have centrally installed packages but that Python doesn’t need any gc functionality “for obvious reasons”. That didn’t make sense to me and it seems my gut reaction was correct: if packages are installed in PYTHONPATH there’s no way to automatically remove them when no project is using them—everything is kept until it is explicitly removed. For poetry-managed, locally installed dependencies, sure, you don’t need gc since you can just delete the whole directory. But for centrally installed packages, there is no gc equivalent so you have to figure out what “bloat” can be safely deleted on your own.

This happens all the time: you have installed a Python package on your computer and have it working; I try installing the same version of the same package on my system and it doesn’t work: installation fails somehow. The “you” and “me” here could be the same person at different points in time. Why does this happen? Because installing a Python package entails running Python code that tries to do different things based on the state of the system it’s running on. It can fail, or it can decide, based on the state of the system, that the package should have different dependencies than it had when it was installed on a different system.

By contrast, when you install packages and dependencies in Julia, no package code is run; all that happens is that a bunch of tarballs are unpacked in the right places. It’s possible that those unpacked tarballs won’t work the same on a different system, but the installation step itself is basically foolproof.


Here is a real-world situation I am encountering as we speak. Someone just asked me to run their (massive) Julia project and to check it out from GitHub. I am not interested in any aspect of this project and do not want the project and its dependencies anywhere near my own production and development code.

I cloned the repo to a separate folder, then did Pkg.activate(path/to/this/project/tomls) and then instantiate. This step downloaded all the dependencies (which I don’t want anywhere near my own stuff) and dumped everything into the ~/.julia folder.

What is Julia’s solution for a situation like this? My personal solution was to export JULIA_DEPOT_PATH and then follow the steps above. (This is what I gathered from the messages above, but please correct me if I am mistaken, and apologies for the repetition. Just a final word and I am all set.)

PyPI (pronounced “py pee eye”, not PyPy) doesn’t have the concept of “upstream source”. You upload the source and binary to PyPI yourself.

Binaries are provided in .whl files, aka “wheels”. Package owners build and upload wheels (“binary distributions”) and source distributions in their CI process or manually. See e.g. the list of requests distributions (source and binary).

Technically venv uses a separate environment variable VIRTUAL_ENV rather than PYTHONPATH, but it’s a similar idea. Poetry can install things centrally, but it still makes a separate env for each project, so e.g. ~/.local/share/poetry/envs/myproject-$pythonversion-$directoryhash/. Then running poetry env remove python3.7 from the project directory deletes the env directory.

As I said above, Pkg.jl is missing this feature. To solve it, add a command that deletes the dependencies of the current project (except those needed by other projects).

In Julia I sometimes try to use packages that “failed to precompile” during installation and which I then can’t import. I think that’s similar.


You basically have two options. Either you trust that Pkg works as designed and doesn’t mix things up and is able to collect garbage after you have removed the environment; i.e. accept that it’s fine that it all lands in ~/.julia. Or, indeed, you point JULIA_DEPOT_PATH to some alternative, possibly temporary, location. If there are packages you use for production which still use the deps/build.jl mechanism, there may be some cause to be conservative about it.
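
For the second option, a throwaway depot for the one-off project can be as simple as this sketch (the paths are illustrative):

```julia
# Launch Julia with a temporary depot, e.g.:
#   JULIA_DEPOT_PATH=/tmp/their-depot julia --project=path/to/their/project
using Pkg
Pkg.instantiate()   # all of their dependencies land in /tmp/their-depot
# When you're done, simply delete /tmp/their-depot; nothing touches ~/.julia.
```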


Thanks for this note. I decided to rely on gc and let all the dependencies of this project (which I am not planning to work on long term, but have to deal with now) go into ~/.julia. The next issue is that I need to modify the Project.toml file to change the [compat] version of one of this project’s dependencies. This dependency overlaps with another project of mine (which I will continue to work on, and I don’t want this new project to mess up any of its dependencies). These dependencies are custom packages and are part of a custom registry.

What is the right way, or the Julia way, to proceed in this scenario? Thanks. [In Python with poetry, one can change the pyproject.toml file either manually or using ‘poetry add’, and then run ‘poetry update’ to update the dependency graph, again locally, and we are good to go. I am looking for the equivalent of ‘poetry update’ in Julia.]