Re-creating a system image is much faster (about 2.4 times) with Julia 1.10.0-beta1 than with 1.9.2:
Julia 1.10.0-beta1
------------------
[ Info: PackageCompiler: Done
✔ [01m:47s] PackageCompiler: compiling incremental system image
Precompiling the plotting libraries...
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonCall/qTEA1/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonPlot/f591M/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonCall/1f5yE/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonPlot/f591M/CondaPkg.toml
CondaPkg Dependencies already up to date
real 3m28,316s
user 10m18,937s
sys 0m57,292s
Julia 1.9.2
-----------
[ Info: PackageCompiler: Done
✔ [05m:57s] PackageCompiler: compiling incremental system image
Precompiling the plotting libraries...
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonCall/qTEA1/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonPlot/f591M/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonCall/1f5yE/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonPlot/f591M/CondaPkg.toml
CondaPkg Dependencies already up to date
real 8m25,084s
user 8m26,257s
sys 0m43,561s
The second half of the compilation with PackageCompiler uses all CPU threads (I have 16 cores). That is nice! The only downside: my laptop crashed in this situation when running in turbo mode…
The package precompilation time is not included in these numbers because they are from the second run. It is also much faster with 1.10-beta1, but more difficult to measure.
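For context, the timings above cover the incremental sysimage build itself. Such a build is typically driven by a small script calling PackageCompiler; the sketch below is purely illustrative (the package list, image name and precompile script are assumptions, not my actual build script):

# Illustrative sysimage build script (assumed package list and file names)
using PackageCompiler

create_sysimage(
    ["PythonPlot", "PythonCall"];                      # packages to bake into the image
    sysimage_path = "sys_image.so",                    # hypothetical output file
    precompile_execution_file = "test/test_for_precompile.jl"  # hypothetical warm-up script
)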
The sysimage creation (at least part of it) is indeed parallel in 1.10, so that can give a bit of a boost when running on a multi-core machine.
I figured out the problem with my laptop: it was running out of RAM (it has only 16 GB of physical RAM). Installing zram fixed this problem; see How to Configure ZRAM on Your Ubuntu Computer - Make Tech Easier
But what happened without zram was really bad: the computer stalled completely, the OOM killer did not become active, and I had to do a hard power-off with the power switch…
My current results:
Julia 1.9.2 on Laptop 16GB RAM, 4 cores: max RAM usage 10.1 GB
[ Info: Precompile script has completed execution.
[ Info: PackageCompiler: Done
✔ [10m:17s] PackageCompiler: compiling incremental system image
Precompiling the plotting libraries...
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonCall/qTEA1/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonPlot/f591M/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonCall/dsECZ/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonPlot/f591M/CondaPkg.toml
CondaPkg Dependencies already up to date
real 19m2,094s
user 26m55,217s
sys 0m48,471s
Julia 1.10.0-beta1 on Laptop 16GB RAM, zram enabled, 4 cores: max RAM usage 24.2 GB
[ Info: Precompile script has completed execution.
[ Info: PackageCompiler: Done
✔ [07m:50s] PackageCompiler: compiling incremental system image
Precompiling the plotting libraries...
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonCall/qTEA1/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonPlot/f591M/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonCall/1f5yE/CondaPkg.toml
CondaPkg Found dependencies: /home/ufechner/.julia/packages/PythonPlot/f591M/CondaPkg.toml
CondaPkg Dependencies already up to date
real 12m18,923s
user 22m35,711s
sys 1m23,100s
That is still a nice speedup for a 4-core laptop.
Nevertheless, creating a sysimage should NOT fail in 1.10 when it works in 1.9. I created an issue: Creating a system image fails in Julia 1.10.0-beta1 while it works with 1.9 · Issue #50729 · JuliaLang/julia · GitHub
This is why systemd-oomd exists. It kills off out-of-control processes taking up too much RAM earlier than the kernel OOM killer would and makes desktop machines FAR more stable. I haven’t seen my machine go to its knees like that for a couple of years now.
Thanks for pointing that out, but Julia should not start multiple threads if there is not enough memory for them. To be clear, I did NOT use -t auto when starting Julia; I used the line:
julia --pkgimages=no --project -e "include(\"./test/create_sys_image.jl\");"
This should NOT result in an out-of-memory error due to the use of multiple threads…
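For what it's worth, without a -t/--threads flag and with JULIA_NUM_THREADS unset, the Julia process itself starts with a single user thread, which is easy to verify (so whatever uses all cores during the build, it is not these user threads):

julia> Threads.nthreads()   # threads in the default threadpool
1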
I’m guessing it forks off multiple subprocesses, not threads. Agreed, there should be a way to prevent over-forking, but I’m not sure how well Julia can figure out the right number on its own.
Well, you can always have a simple, automatic rule like “don’t fork if there is less than 6 GB of free RAM” and a command-line parameter that can override this rule…
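Just to illustrate, here is a sketch of such a rule in Julia (the 6 GB threshold is the number from my suggestion; the function name, the override keyword and the scaling of jobs with free RAM are assumptions, not an existing API):

# Sketch of the proposed heuristic: stay serial when free RAM is low,
# unless the user explicitly overrides the job count.
function default_njobs(; override::Union{Int,Nothing} = nothing, min_free_gb::Real = 6.0)
    override === nothing || return override            # explicit user override wins
    free_gb = Sys.free_memory() / 1e9                   # free RAM in GB
    free_gb < min_free_gb && return 1                   # low memory: don't fork
    return min(Sys.CPU_THREADS, floor(Int, free_gb / min_free_gb))
end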
True, but you need to 1) be able to determine how much free RAM there is on the system and 2) be able to estimate how much RAM compiling something will take, which of course in the general case is undecidable without just… doing it.
Not to mention that such a rule would have to be something the individual user configures. There are plenty of machines that don’t have 6 GB of RAM free at boot (I’ve got a 4 GB laptop, for example, and there are 1 GB Raspberry Pis, etc.).
If you have to configure it anyway, you might as well just use:
systemd-run --user --scope --unit=julia_ramlimit -p MemoryMax=4000M julia
So on all of these machines system image creation should be single-threaded by default… Where is the issue? And in Julia you can determine the free RAM (in GB) easily: Sys.free_physical_memory()/1e9