I tried a simple example of parsing a multipart HTTP request (a JSON config plus some CSV data). The request is about 5 MB. Doing just
```julia
const r = deserialize("httpRequestStored")  # the request serialized earlier, for convenience

function testParsing(request)
    HTTP.parse_multipart_form(request)
end

@benchmark testParsing($r)
```

```
BenchmarkTools.Trial:
  memory estimate:  584.20 MiB
  allocs estimate:  4785904
  --------------
  minimum time:     190.401 ms (6.29% GC)
  median time:      194.841 ms (6.66% GC)
  mean time:        196.294 ms (6.67% GC)
  maximum time:     213.147 ms (5.62% GC)
  --------------
  samples:          26
  evals/sample:     1
```
results in almost 600 MB of allocations. That is quite a lot for parsing just 5 MB of data. Any idea what is going wrong, or how I can fix this? I assume I am benchmarking correctly.
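As a cross-check that the allocations come from the parse itself and not from the benchmarking setup, I also measured a single warmed-up call with `@allocated` (a sketch using the same stored request file as above):

```julia
using HTTP, Serialization

r = deserialize("httpRequestStored")  # same stored request as in the benchmark

# Call once first so compilation allocations are not counted.
HTTP.parse_multipart_form(r)

# Measure the allocations of one parse directly.
bytes = @allocated HTTP.parse_multipart_form(r)
println(bytes / 1024^2, " MiB allocated per call")
```

This reports roughly the same per-call figure as the BenchmarkTools `memory estimate`, so the allocations appear to be real, not a measurement artifact.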
I am on Julia v1.5.4 and HTTP v0.9.7.