Julia execution gets an out-of-memory error

I am running Julia through the terminal on my system. When I try to run machine learning algorithms, I get an out-of-memory exception. My system has 16 GB of RAM. Please explain Julia's memory allocation concept to me. I have enclosed my code here:
```
function Master()
    println(nprocs())
    # read the CSV and split it into features and labels
    table = readdlm("diabetics2.csv", ',')
    data = convert(DataFrame, table)
    features = convert(Array, data[:, 1:8])
    labels = convert(Array, data[:, 9])
    procs = nprocs()
    i = 1
    n, m = size(features)
    # println(n, "--", features[i:Int(n/procs)])
    # intended to split the rows into one chunk per process
    while procs != 0
        k = Int(n / procs)
        println(k)
        data_procs = features[i:k, :]
        labels_procs = labels[i:k, 1]
        i += k
        n -= k
        procs -= 1
    end
    println("Assigning to processes the work...")
    for i in 1:nprocs()
        @spawnat i trainLogisticRegression(data_i, label_i)
    end
end
```

Is parallel execution possible with Julia? When I run the above code, my system gets stuck.
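For reference, this is a minimal, self-contained sketch of the kind of row partitioning across workers that I am trying to do (it uses random data and a placeholder `trainLogisticRegression` that just reports its chunk size, and needs the `Distributed` standard library on recent Julia versions):

```
using Distributed
addprocs(4)                          # start 4 worker processes

@everywhere function trainLogisticRegression(X, y)
    # placeholder: just report which worker got how many rows
    return (myid(), size(X, 1))
end

X = rand(1_000, 8)                   # stand-in for the 8 feature columns
y = rand(0:1, 1_000)                 # stand-in for the label column

n = size(X, 1)
chunk = cld(n, nworkers())           # rows per worker, rounded up
ranges = [r:min(r + chunk - 1, n) for r in 1:chunk:n]

futures = []
for (w, r) in zip(workers(), ranges)
    push!(futures, @spawnat w trainLogisticRegression(X[r, :], y[r]))
end
println(fetch.(futures))
```

I understand that `pmap` can express the same dispatch pattern with less manual bookkeeping, but the chunked `@spawnat` version above matches what my code is attempting.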

This looks like the same code @Devi_Sree posted at Julia run using terminal for 1GB dataset showing out of memory error - #5 by StefanKarpinski. As people in that thread said, it's hard to help unless you post a fully reproducible example. Perhaps you could place your data file diabetics2.csv on the internet and provide a URL, or provide a short piece of code that would generate a random CSV file of the same size.
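For instance, something along these lines would do; the row count here is just a placeholder, so pick one that makes the generated file roughly as large as yours:

```
using DelimitedFiles

nrows = 100_000                      # placeholder; increase until the file is about the size of the real one
table = hcat(rand(nrows, 8), rand(0:1, nrows))   # 8 random feature columns plus a 0/1 label column
writedlm("diabetics2.csv", table, ',')
```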

It would also help if you formatted your code by enclosing it in triple backticks, e.g. ``` <code>```.

First, please quote code with triple backticks ```

Then, this is a bit of an odd posting, as exactly the same question was posted before: Julia run using terminal for 1GB dataset showing out of memory error (in particular, the code example is identical). Please do not post on the same topic twice (neither on this forum, nor on both this and a separate forum such as Stack Overflow). One potential reason there was no answer on the other thread is that there is still no runnable example, since no data file was provided.

Edit: looks like I am a bit slower than others…