Or ship a system image with the packages you plan to use for the course. You’d have to build the image on Windows, Mac, and Linux, but it’s possible (and possible to do in CI — this would be interesting to test with SciMLDocs). But Pluto cannot make use of system images, so that would be the issue if you wanted to use Pluto in the course.
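For anyone who hasn’t done this before, a minimal sketch with PackageCompiler.jl might look like the following (the package list and output file name are just examples, not a recommendation):

```julia
# Sketch: build a course system image with PackageCompiler.jl.
# The packages listed here are placeholders for whatever the course uses.
using PackageCompiler

create_sysimage(
    ["Plots", "DataFrames"];               # packages to bake into the image
    sysimage_path = "course_sysimage.so",  # .dll on Windows, .dylib on macOS
)
```

Students would then start Julia with `julia --sysimage course_sysimage.so`, which (as noted above) helps in the REPL and IJulia but not in Pluto, since Pluto manages its own per-notebook environments.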
This seems unlikely, given the increase in binary data: ~900 MB for 1.8.5 vs ~6.5 GB for 1.9.0-beta3 (storage costs?).
BTW, it’s been years and Pluto still doesn’t have docs, but I thought this would be a good solution for a class setting?:
BTW, if anyone wants to help reduce precompile times, this just landed:
If you open up the CI and grab the artifacts, you will find binaries of the Julia build from that PR, with the feature to time precompilation. It would be really nice for someone to run this on a few packages (OrdinaryDiffEq, Plots, Makie, etc.) and see if it can identify some common dependency hotspots.
If I click on the HTML example of PlutoVista I get an error message:
ArgumentError: Package TetGen [c5d3f3f7-f850-59f6-8a2e-ffc6dc1317ea] is required but does not seem to be installed:
Sorry for the title! I really spent a lot of time tweaking my post to avoid an inflammatory tone, and to keep it positive-minded and constructive. But I did not think much about the title, and what it looks like to people outside of my bubble. Just updated!
Pluto precompiles in parallel, just like the REPL; we call
Maybe Pluto could have a “package depot” on the first page, where one can install a bunch of packages (with a proper progress bar); notebooks launched afterwards would then search that depot before downloading a new version.
That would decouple the “installation” and “using” phases somewhat, and with current precompilation, using the notebook itself would be nicer.
That could more or less handle, as a Pluto feature, the situation of the instructor indicating which packages to install.
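Something close to this can already be prototyped with a shared Pkg environment; a rough sketch (the environment name `@course` is just an example):

```
(@v1.9) pkg> activate @course      # named, shared environment under ~/.julia/environments
(@course) pkg> add Plots DataFrames   # install the course packages once, up front
(@course) pkg> precompile             # force precompilation now, not at first notebook launch
```

Packages installed this way land in the shared depot, so a later `Pkg.add` of the same versions elsewhere reuses the downloaded and precompiled artifacts instead of fetching them again.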
Thank you Tim!
Hearing about the experience in your class is exactly what I needed to hear to relieve my worries. I have recently worked a lot on making the launch process more transparent (like the new Status tab), and we are planning to add automatic messages like your suggestion, when we see that a process is taking quite long.
After spending so much time improving the first-launch experience, it was a frustrating setback to see the big increase in launch times, when you expect new Julia versions to always be faster. But on the other hand, this makes my current and future work feel more valuable.
Thanks to everyone else for their suggestions!
Some suggested a bespoke setup, optimised for packages used in a course. I have avoided this in the past because of the added complexity, but perhaps Julia 1.9 means that I need to change my stance here, since the benefits are now much larger.
Sorry to hear that there were setbacks. If they can’t be easily addressed (I’m hoping they can be), then we should fix it!
I’m always surprised in these kinds of threads that courses don’t seem to use a manifest where they’ve checked that all versions work, and then `]pin --all` the versions. That way, there are no accidental updates invalidating any caches, and everything should still be ready for use after the first run. Is the option to pin versions just not well known? It’s pretty much required for reproducibility, after all.
From what I can tell, `pin --all` (I was not aware of this option) only pins the direct deps, not the indirect ones. So I’m not certain it’s sufficiently comprehensive to prevent all upgrades.
Good idea, I think. Here’s a PR allowing `pkg> pin/free --all -m` to pin/free all indirect deps too:
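For reference, a sketch of the existing behaviour and what the PR adds (the `-m` flag comes from that PR, so it may not be in a released Pkg yet):

```
(@v1.9) pkg> pin --all       # pins every *direct* dependency at its current version
(@v1.9) pkg> free --all      # undoes the pins
(@v1.9) pkg> pin --all -m    # with the linked PR: also pin indirect (manifest) deps
```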
Would an `sh` (Linux and macOS) or CMD (Windows) script be enough for you?
I can offer to write it, with comments on how to tweak the default environment.
The juliaup shell script can be downloaded from https://install.julialang.org/ .
I think we may need more utilities to manipulate and copy project environments and their manifests.
I wrote one at one point:
Ah, that explains it then. I always recommend using it if you want to enforce some set of versions and don’t want them to change, e.g. for reproducibility of a paper. In a class setting, this can also be used to make sure that students don’t inadvertently update dependencies and use a version that’s not “sanctified”/tested by the lecturers, running into issues that shouldn’t happen in the first place. This can be implemented very easily by a `diff` of the `Manifest.toml` of the assignment and the `Manifest.toml` submitted by the student. The details come down to the philosophy of teaching, though: e.g. if you want to allow students to add new packages, or don’t want such strict requirements, a different workflow may be required.
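As a sketch of that check (the directory names `assignment/` and `submission/` are just for illustration): identical manifests produce no `diff` output, and any version drift shows up immediately.

```shell
# Compare the course's pinned Manifest.toml against a student's submission.
# diff exits 0 when the files match, non-zero when they differ.
diff assignment/Manifest.toml submission/Manifest.toml \
  && echo "Manifest matches" \
  || echo "Manifest differs: student environment has drifted"
```

A grading script could run this per submission and only flag the ones that drifted.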
Maybe this is not a helpful comment, but why do we assume these first-time students must run on their laptops?
The instructor could use an on-premises or cloud server with lots of cores and RAM,
start multiple Pluto notebooks,
and give the students web-based access at class start time.
As an aside, the ideas to publish Time to First Pluto benchmarks assume that the same hardware is available over several years.
I will venture to say that the change from spinning drives to solid state probably has had a huge effect on compilation time for many languages.
Also caches in the newest generation of CPUs are getting larger.
I don’t fully understand that. Here is what I tried:
- activated a new project, added `DataFrames` to it. That generated one `.julia/compiled/v1.9/DataFrames` entry.
- then activated a new project, added `CSV` to it. This did not trigger creation of a new `.julia/compiled/v1.9/DataFrames`; as far as I can tell, this new project is just using the same `dll` file that was created in the previous project.
So to me this looks like the `dll` files are not created per package/project combo, but rather just per package version, and if that package version is used in multiple projects then it is reused. If that is so, doesn’t that suggest that the content of this `dll` does not depend on what other packages are in the project whose precompilation created these files? And if that is so, couldn’t we just store that exact version of the `dll` file in the cloud?
I’m probably missing something, but if there is only one precompiled version stored on my system per package version, then I don’t understand why it can’t be cached somewhere other than my system.
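For anyone wanting to reproduce the observation, the cache directory can be inspected directly; on my machine it looks roughly like this (the slug in the file names is made up — the actual names are hashes):

```
$ ls ~/.julia/compiled/v1.9/DataFrames
AbCdE_fGhIj.ji   AbCdE_fGhIj.so
```

On Julia 1.9 each entry is a `.ji` metadata file plus a native-code package image (`.so` on Linux, `.dylib` on macOS, `.dll` on Windows), which is presumably the “dll” discussed above.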