At work I have a single user account, a desktop on my desk, and multiple compute machines I can ssh into. All of these share the same file system: they use the same .julia folder, so if I add a package on one machine it is available when I ssh into another, and they all share the same config files, etc.
However, I am having tons of problems with precompilation. Pretty much every time I start Julia I have to sit through a long precompilation process. I have a package for my project, and every time I run using MyProjectPackage it precompiles again, often without me having made any actual changes.
One question I had: since the machines share a file system, if I precompile something on one of them, should it then be precompiled on all of them or not? Right now each machine very much seems to precompile separately (and everything gets stuck if I attempt to run something on two machines at the same time).
Is this expected behaviour, or are my problems likely due to something else?
You don’t want -Cgeneric. That will limit you to instructions from the 1990s. You probably want -Cx86-64-v3 if all your CPUs are from the past 10 years, or -Cx86-64-v4 if they all have AVX-512.
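For concreteness, here is a sketch of how one might pin the target so every machine produces and reuses the same cache flavor. The choice of x86-64-v3 is only an example; check Sys.CPU_NAME (or versioninfo()) on each node before picking one:

```bash
# Sketch: pick one target supported by every node (x86-64-v3 assumes AVX2 everywhere).
# Option 1: pass it when launching Julia
julia -Cx86-64-v3

# Option 2: set it once in a shared shell profile so all machines agree
export JULIA_CPU_TARGET="x86-64-v3"
```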
There is also the environment variable JULIA_MAX_NUM_PRECOMPILE_FILES, which someone on this forum suggested setting to a higher value (e.g. ~50) if disk space is not a big concern. In my case (heterogeneous cluster machines) it does seem to help reduce the number of (re-)precompilations.
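If you want to set it persistently, a minimal sketch (50 is just the figure mentioned above; the default limit is 10):

```bash
# Sketch: raise the per-package cache limit in a profile shared by all machines,
# so caches compiled for different CPUs/targets are less likely to evict each other.
export JULIA_MAX_NUM_PRECOMPILE_FILES=50
```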