Hello all,
I have some CSV files and I want to read and process them with a Julia script, launched by a bash script on a cluster. For each CSV, I want an independent job on the cluster.
Is there a smart way of doing that?
Many thanks!
Precisely the example we give in this guide:
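For anyone landing here later, a minimal sketch of the pattern (file names and the processing step are assumptions, not taken from the guide): a `main.jl` that receives the CSV path as a command-line argument, so each submitted job processes exactly one file. A bash loop such as `for f in data/*.csv; do sbatch run.sh "$f"; done`, with a hypothetical `run.sh` wrapping `julia main.jl "$1"`, would then create one independent job per CSV.

```julia
# main.jl -- process a single CSV file whose path is passed on the command line.
# Sketch only: replace the placeholder processing with your actual workload.
using CSV, DataFrames

function process_file(path::AbstractString)
    df = CSV.read(path, DataFrame)            # load the CSV into a DataFrame
    result = describe(df)                     # placeholder: summary statistics
    out = replace(path, ".csv" => "_result.csv")
    CSV.write(out, result)                    # write results next to the input file
end

isempty(ARGS) && error("usage: julia main.jl path/to/file.csv")
process_file(ARGS[1])
```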
Amazing! Thanks a lot!
@juliohm, I would like to do the same as your example, but I also need to parallelize the for loop in the main.jl file. In your example, I think each loop iteration runs on one node? In my case I would like to have 4 nodes per loop iteration. Any idea on how to do this? Many thanks!
@99dB the Distributed package creates parallel Julia processes that can run on any number of nodes. The example simply runs each iteration on whichever process is available, and those processes can live anywhere.
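To illustrate that point, here is a hedged sketch with Distributed, assuming workers have already been added (via `addprocs` or a cluster manager); `process_file`, the `data` directory, and the placeholder result are hypothetical, not from the guide:

```julia
# Sketch: distribute loop iterations over however many workers are available.
using Distributed
addprocs(4)                        # or use a cluster manager to span multiple nodes

@everywhere using CSV, DataFrames  # load packages on every worker

@everywhere function process_file(path)
    df = CSV.read(path, DataFrame)
    # ... per-file work goes here ...
    return nrow(df)                # placeholder result
end

files = readdir("data"; join=true)      # one CSV per iteration (assumed layout)
results = pmap(process_file, files)     # each iteration runs on whichever worker is free
```

With `pmap`, the iterations are farmed out to the worker processes as they become free, so the same code scales from one node to many without changes to the loop itself.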
Ok great! Thanks a lot!