I’m having one more problem: I would like to run the computation for the folders in “parallel”, and reading the documentation about channels, I see that channels are well suited for such an I/O-intensive task.
However, since I have a lot of folders, I would like to control the number of “concurrent” tasks. The documentation I link to above has an example where sleeping processes run 4 at a time, but I cannot figure out how to adapt that to my situation.
With an `average_image` function like in my first post, I have a wrapper that saves the output:

```julia
function save_average(dir, sz)
    files = joinpath.(dir, readdir(dir))
    avgimg = average_image(files, sz)
    savename = string(dir, ".png")
    save(savename, avgimg)
end
```
Processing all folders at once, as explained in this question, works, but I fear it is too aggressive:

```julia
@sync for dir in dirs
    @async save_average(dir, sz)
end
```