Writing parallel worker output to .out file in slurm to track progress

I am solving an ODE with many different initial conditions using EnsembleDistributed on a cluster, and I'd like to track the progress of the ensemble calculation. To do that, I wrote the following problem (prob_func) and reduction functions:

@everywhere begin
    #for ensemble problem
    init_vals = $init_vals # 40 × trajectories matrix of initial conditions
    function prob_func(prob, i, repeat)
        println(i)
        flush(stdout)
        remake(prob; u0 = init_vals[i].Y_0, tspan =(init_vals[i].T_i, init_vals[i].T_f))
    end
    function reduction(u, batch, I)
        jldopen("traj_try.jld2", "a+"; compress=true) do file
            name_of_data = "solvec$(I[1])"
            file[name_of_data] = batch
        end
        println("till $(last(I)) written")
        flush(stdout)
        last(I), false
    end
end
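For context, here is roughly how these functions plug into the solve call (the solver choice and batch_size below are illustrative, not my exact setup, and prob is the base ODEProblem):

```julia
using Distributed, DifferentialEquations

# assemble the ensemble with the prob_func and reduction defined above
ensemble_prob = EnsembleProblem(prob; prob_func = prob_func, reduction = reduction)

# batch_size controls how many trajectories are solved before reduction runs
sim = solve(ensemble_prob, Tsit5(), EnsembleDistributed();
            trajectories = length(init_vals), batch_size = 100)
```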

This only writes to the ".out" file created with the job when it hits the reduction function. The println statements from prob_func only appear after a whole batch of trajectories is complete.
How can I print the statements in real time to track how many trajectories have been run?
Or is there a better way to track the progress of the ensemble calculation?
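One alternative I've considered is reporting progress through a RemoteChannel instead of relying on worker stdout, so the master process prints as soon as any trajectory finishes. Here is a minimal self-contained sketch of that idea (simulate is a placeholder for solving one trajectory, not my actual ODE setup):

```julia
using Distributed
addprocs(2)

const ntraj = 6
# buffered channel living on the master process; workers push trajectory ids into it
const progress = RemoteChannel(() -> Channel{Int}(ntraj))

# placeholder for solving one trajectory; put! reports completion immediately
@everywhere function simulate(i, chan)
    sleep(0.01)
    put!(chan, i)
    return i^2
end

# monitor task on the master prints as soon as any worker reports in
monitor = @async begin
    for done in 1:ntraj
        i = take!(progress)
        println("trajectory $i finished ($done/$ntraj)")
        flush(stdout)
    end
end

results = pmap(i -> simulate(i, progress), 1:ntraj)
wait(monitor)
```

In the real problem the put! call would go inside prob_func (or an output_func), so the counter updates per trajectory rather than per batch.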
Thanks