I was using JSON to parse a large (1.6 GB) JSON file, and the memory usage was surprising. More surprising, perhaps, was that even after removing the reference to the parsed object and garbage collecting, memory usage remained high:
using JSON
# first write and parse a small (25-byte) JSON file to trigger compilation
# Julia memory usage ~200 MB
d = JSON.parsefile("really_big.json")
# takes a long time
# Julia memory usage 12 GB
d = nothing
Base.GC.gc()
# Julia memory usage 8 GB
Is this pattern expected? Can anything be done to mitigate it? Thanks!
As far as I know, it is. The main reason we still have JSON around is that we need something in the stdlib that can parse JSON for package-management purposes.
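In the meantime, a couple of things sometimes help. A single GC.gc() may not sweep everything, so requesting a few full collections can reclaim more, and on Linux/glibc you can additionally ask the C allocator to hand freed pages back to the OS via malloc_trim (that part is a glibc-specific assumption and does nothing on other platforms). Also note that resident memory as reported by the OS overstates the live heap; Base.gc_live_bytes() shows what the GC actually considers live. A minimal sketch:

d = nothing
for _ in 1:3
    GC.gc(true)  # `true` requests a full, non-incremental collection
end
# Linux/glibc only: ask the C allocator to return freed pages to the OS
Sys.islinux() && ccall(:malloc_trim, Cint, (Csize_t,), 0)
# compare what the GC considers live against what the OS reports as resident
println(Base.gc_live_bytes() / 2^30, " GiB live")

Even after this, the process may keep some address space reserved, since the GC tends to hold onto pools it has grown into, so the resident number not returning to baseline doesn't necessarily mean the parsed tree is still alive.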