CSV.jl: problem writing big DataFrames

Hey,

This is a niche problem that not many people will face, but I am running into it. I have been using CSV.jl to save DataFrames from my scientific simulations and it worked fine: the frames are huge (around a million rows), but each element was small enough that writes still finished. My latest simulation, however, produced a DataFrame that takes about 3 GB in RAM, and CSV.write was not able to save it; I waited a couple of hours with no result.
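For context, my current approach is just a single monolithic write. This is a minimal sketch of that pattern; the frame and column names here are made up stand-ins, not my actual simulation output:

```julia
using CSV, DataFrames

# Hypothetical stand-in for the simulation output: ~1M rows.
n = 1_000_000
df = DataFrame(t = 1:n, x = rand(n), y = rand(n))

# Current approach: one monolithic CSV.write of the whole frame.
# This is the call that stalls once the frame grows to multiple GB.
CSV.write("results.csv", df)
```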

I have already optimised my data, and I need all of it for the next step.

I was wondering if there is any way to speed this up, or whether another library could reduce the write time.

I do know that an alternative would be to write each row (or batch of rows) as it is created, but that would mean rewriting my code, and I would prefer a shorter route.
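To be clear about what I mean by that alternative: CSV.write supports `append = true`, so the file could be built up chunk by chunk instead of holding the whole frame in RAM. A rough sketch, with hypothetical column names and chunk sizes:

```julia
using CSV, DataFrames

path = "results.csv"

# Write the header once using an empty frame with the right schema.
CSV.write(path, DataFrame(t = Int[], x = Float64[]))

# Then append each chunk as the simulation produces it; with
# append = true, CSV.write skips the header on subsequent calls.
chunk_size = 100_000
for chunk_start in 1:chunk_size:1_000_000
    chunk = DataFrame(t = chunk_start:(chunk_start + chunk_size - 1),
                      x = rand(chunk_size))
    CSV.write(path, chunk; append = true)
end
```

The downside, as I said, is that this requires restructuring my simulation loop rather than dumping one finished DataFrame at the end.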

Thanks.