I am fairly new to Julia, and I am writing code to train a neural network. I have loaded my training dataset, but the learning process takes far too long. When I timed my loss function, a single call took about 6 seconds:
function loss2()
    err = Float64(0.0)
    for ics = 1:n_cases
        for ipt = 1:n_pts
            err += sum((NN2([xyz[ipt,ics,:]..., uparams[ics,:]..., wa[ics]])
                        .- pp[ipt,ics,:]).^2)
        end
    end
    return err
end
@time loss2()
6.858366 seconds (42.79M allocations: 1.672 GiB, 4.22% gc time)
I believe this is the reason my training is so slow (it's on the order of days!). Each loss evaluation makes n_pts × n_cases = 576 × 1296 ≈ 747,000 separate calls to NN2, one point at a time.
xyz is a 576x1296x3 array
uparams is a 1296x9 array
wa is a 1296 length vector
pp is a 576x1296x5 array.
n_pts = 576
n_cases = 1296
Each network input is therefore a 13-element vector (3 coordinates + 9 parameters + 1 scalar), which matches the first Dense layer. NN2 is defined as follows:
NN2 = Flux.Chain(
    Flux.Dense(13, 16, sigmoid),
    Flux.Dense(16, 5),
    y -> abs.(y))
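In case it helps to run the code above, here is a minimal stand-in for my data: random arrays with the same shapes as the real dataset (the real values are loaded from files; I am assuming Float64 here):

using Flux

n_pts, n_cases = 576, 1296

# random placeholder arrays, shaped like my real data
xyz     = rand(n_pts, n_cases, 3)   # point coordinates, 576x1296x3
uparams = rand(n_cases, 9)          # per-case parameters, 1296x9
wa      = rand(n_cases)             # per-case scalar, length 1296
pp      = rand(n_pts, n_cases, 5)   # targets, 576x1296x5

Together with the NN2 and loss2 definitions above, this should be enough to run the loss function end to end.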
Could you please suggest ways to reduce the allocations within the loss function, or share general speed-improvement tips?