Julia on remote RedHat cluster

Dear all,

In order to conduct numerical calculations I have been granted access to a remote high-performance cluster running Red Hat Linux. So far, I can run a batch script that calls a simple Julia file with:


which returns:


["/users/pa19/lanast/.julia/environments/v1.3/Project.toml", "/users/apps/compilers/julia/1.3.1/share/julia/stdlib/v1.3"]

The admins advised that certain variables must be set in order to fetch and run all the Julia packages:
export PATH=$JULIAROOT/bin:$JULIAROOT/tools;
export INCLUDE=$JULIAROOT/include;

Now, do I have to type these commands into the terminal? Or should I place them inside a .sh or a .jl file? And do I need to include the "export" command?

Thank you, all the best

Customizations and how jobs can be run should be documented for your cluster; this is not something specific to Julia.

Perhaps talk to the sysadmins, or a user who is experienced using this cluster and has a setup to share.

I agree with @Tamas_Papp: talk to your friendly sysadmins. Promise to bring them cookies or beer when the current fuss is over.

Many HPC clusters use the "modules" environment to set these variables.
Type `module avail` to see if there is a Julia module. Also ask the sysadmins about modules.
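As a sketch, a session on a modules-based cluster might look like the following. The module name `julia/1.3.1` is only a guess based on the version visible in your paths; pick whatever `module avail` actually lists. The guard also makes it safe on machines with no module system at all:

```shell
# Load a Julia module if the cluster provides one.
if command -v module >/dev/null 2>&1; then
    module avail julia        # list the Julia modules installed on the cluster
    module load julia/1.3.1   # hypothetical module name; use one from the list
else
    echo "no module system found"
fi
```

Loading a module typically sets PATH (and related variables) for you, so you may not need the manual exports at all.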

I would put those environment variables at the start of your job script if there is no module available.
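For instance, a batch script could set them right before calling Julia. This is only a sketch: the `#SBATCH` directives are Slurm-style and hypothetical (your cluster's scheduler and options may differ), and `myscript.jl` is a placeholder for your own file:

```shell
#!/bin/bash
#SBATCH --job-name=myjob      # hypothetical Slurm directives; adapt to your
#SBATCH --time=01:00:00       # cluster's scheduler

# The variables the admins mentioned, set at the top of the job script so
# they apply to everything below. JULIAROOT is assumed to already point at
# the Julia installation; :$PATH is appended to keep the existing PATH.
export PATH=$JULIAROOT/bin:$JULIAROOT/tools:$PATH
export INCLUDE=$JULIAROOT/include

julia myscript.jl
```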

Something else to ask your cluster admins.
Which filesystem should you use for storing data and programs?
Normally the /home directory is on a smaller, lower-performance filesystem.
There will probably be a large-capacity, high-performance filesystem as well.
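A quick way to see what filesystems are mounted and how big they are (the definitive answer still comes from the admins, since quotas and intended usage are not visible here):

```shell
# Show mounted filesystems with human-readable sizes; large scratch/work
# areas (names vary per cluster, e.g. /scratch or /work) stand out by size.
df -h
```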