I found a useful little hack, so I'm sharing it here. It may already have been discovered elsewhere, but it's fun to re-discover a nice(?) trick!
If you've ever tried to write a Julia script and put its options in the shebang, you might have noticed that #!/usr/bin/env julia does not work well with options (depending on the OS). Of course, you can write a small wrapper in a separate file, but if you want to do it in one file, you can also do:
#!/bin/bash
# -*- mode: julia -*-
#=
exec julia --color=yes --startup-file=no "${BASH_SOURCE[0]}" "$@"
=#
Base.banner() # put any Julia code here
Anything between the first #= and the exec will be executed as normal bash commands, while Julia skips the whole #= ... =# block as a block comment. You can do anything there, like setting up environment variables.
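For example, here is a minimal sketch of the same trick that exports an environment variable before handing off to Julia (the thread count of 4 is just a placeholder):
#!/bin/bash
# -*- mode: julia -*-
#=
export JULIA_NUM_THREADS=4   # set up by bash; the exec'ed julia inherits it
exec julia --color=yes --startup-file=no "${BASH_SOURCE[0]}" "$@"
=#
println("Julia sees ", Threads.nthreads(), " threads")  # Julia code starts here
Since the variable is exported before the exec, the julia process that replaces the shell inherits it.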
IMO this should go somewhere in Getting Started, since that is the section of the manual that talks about running Julia. It could probably be renamed to "Running Julia" too.
When I run sbatch xxx.jl on a file written this way, the include function does not find files by paths relative to the xxx.jl file.
Is this a bug?
Here is the slurm output:
SystemError: opening file "/tmp/slurmd/QuantumOpticsExtra.jl": No such file or directory
The example file tree is this:
DC_1D
├── figdata
│   └── d_loop_data.jl # file sbatched
├── functions.jl # file needed to include
└── QuantumOpticsExtra.jl # file needed to include
Example xxx.jl:
#!/bin/bash
#SBATCH -N 1
#SBATCH --ntasks-per-node=4
#SBATCH -J test
#SBATCH --cpus-per-task=9
#SBATCH -p work
#SBATCH --output=slurm/slurm-%x-%j.out
#SBATCH --error=slurm/slurm-%x-%j.out
#=
exec julia -t$SLURM_CPUS_PER_TASK --project=. --color=yes --startup-file=no "${BASH_SOURCE[0]}" "$@"
=#
using Distributed, SlurmClusterManager
addprocs(SlurmManager(),exeflags=["--project=.", "-t$(ENV["SLURM_CPUS_PER_TASK"])", "--color=yes", "--startup-file=no"])
@everywhere include("../QuantumOpticsExtra.jl")
@Lightup1 I think you're running into the fact that slurm actually copies your script to a temporary directory when you submit the job, so any files referenced relative to it when you submitted will not be in the same relative place when it runs. I haven't found an elegant way to deal with this, but this ugly way works. Put something like this immediately above the exec command:
export SLURM_ORIGINAL_COMMAND=$(scontrol show job $SLURM_JOBID | grep "^ *Command=" | head -n 1 | cut -d "=" -f 2-)
Then, in the Julia section, you can figure out the original script path like this:
THIS_FILE = if "SLURM_ORIGINAL_COMMAND" ∈ keys(ENV)
    split(lstrip(ENV["SLURM_ORIGINAL_COMMAND"]), " ")[1]
else
    abspath(@__FILE__)
end
The second case is there in case you want to run the script directly, without slurm.
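With THIS_FILE in hand, the problematic include from the example above can be rewritten against the script's real location. A minimal sketch, assuming the file tree shown earlier and the Distributed/addprocs setup from the question (adjust the relative path to your own layout):
script_dir = dirname(abspath(THIS_FILE))                  # directory of the originally submitted script
extra_path = joinpath(script_dir, "..", "QuantumOpticsExtra.jl")
@everywhere include($extra_path)                          # interpolate with $ so every process gets the absolute path
The interpolation matters because @everywhere does not capture local variables, and the workers won't necessarily share the driver's working directory.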