Heroku – Memory quota vastly exceeded

Hey there, I developed a Julia application using Dash and ObjectDetector and deployed it on Heroku, but I am getting an “R15 - Memory quota vastly exceeded” error.

I checked the memory usage on my local computer, and the application (running from the command line) uses about 1200–1600 MB. Heroku dynos are quite expensive; for example, the 2.5 GB dyno costs $250 per month.

I just want to know two things:

  • Is 1200–1600 MB normal for such an application? I didn’t compare directly, but I never ran into a memory problem while working in Python.
  • Is there a way to reduce memory usage? I tried PackageCompiler, but it made almost no difference.

That’s all, thank you! :slightly_smiling_face:

P.S. If you’re wondering, click here for the code.

It might be worth profiling your code to see where the memory usage is coming from.
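As a rough first step (a sketch, not a full profiling workflow; the `data` variable is just a placeholder), Julia’s built-in introspection can show how much of the footprint is live objects versus everything else, which hints at whether the memory is data or compiled code:

```julia
# Rough memory breakdown inside the running Julia session.
using InteractiveUtils  # provides varinfo()

# Total bytes of live, GC-tracked objects in the session.
println("GC live bytes: ", Base.gc_live_bytes())

# Size of a specific object, including everything it references.
data = rand(1_000_000)
println("data: ", Base.summarysize(data), " bytes")

# List the bindings in Main with their sizes.
varinfo()
```

If `gc_live_bytes` is small compared to the process’s resident memory, the bulk of the footprint is likely the runtime and compiled code rather than your data.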


I have also tried to make very minimal Julia applications and in my experience there isn’t really a way to get away with <1GB of RAM. I even tried creating custom sysimages without unused standard libraries and such.
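For reference, the stripped-sysimage attempt looks roughly like this with PackageCompiler.jl (a sketch under assumptions: the package list, output path, and precompile script name are placeholders):

```julia
using PackageCompiler

# Build a custom sysimage containing only the packages the app needs.
# filter_stdlibs=true drops unused standard libraries from the image.
create_sysimage(
    ["Dash", "ObjectDetector"];
    sysimage_path = "app_sysimage.so",
    precompile_execution_file = "precompile_app.jl",  # script that exercises the app
    filter_stdlibs = true,
)
```

You then start the app with `julia --sysimage app_sysimage.so app.jl`. As the thread notes, this mostly helps startup latency; resident memory stays high because the Julia runtime itself is still loaded.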


Exactly. Even though the application is low-code and simple, it uses more than 500 MB. I also saw similar comments about memory usage, so it seems normal for this kind of app. Sysimages reduced running time but not memory usage, unfortunately. Thank you @fredrikekre

I have also noticed the high memory usage of some simple Julia apps. Maybe it’s worth investing in a small server!


To be honest, I’ve never profiled Dash for memory usage. I will do this in the near future; if I find unjustified excessive memory usage, I will fix it. But most likely the memory is used during JIT compilation. It is possible that using JSON3 instead of JSON2 in Dash, which I plan to release next week, will slightly improve the situation.

Thank you very much for pointing out the problem. I will try to come up with ways to solve it, but memory usage is the one area where I am not sure a good solution exists.


I have a fly.io server running Makie.jl and HypothesisTests.jl at a steady 750 MB of memory usage. It wasn’t easy to set up, though. The trick seemed to be that everything (edit: most things) should be precompiled and included inside the Docker image so that the production image doesn’t have to load anything else. For example, I also had to explicitly load MKL_jll to include the MKL binary; without doing that, memory went to 1.6 GB. The Docker image is based on Ubuntu. An image based on Nix needed the same amount of memory. Alpine was an enormous hassle to get working with Makie, so I gave up on that.
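The pattern described above can be sketched as a Dockerfile that instantiates and precompiles everything at image build time, so the production container starts from a warm depot (a rough illustration, not the actual configuration; the base image tag, file names, and entry point are assumptions):

```dockerfile
FROM julia:1.9-bullseye

WORKDIR /app
COPY Project.toml Manifest.toml ./

# Resolve and precompile all dependencies at image build time,
# so the running container does not compile on first request.
RUN julia --project=. -e 'using Pkg; Pkg.instantiate(); Pkg.precompile()'

COPY . .
CMD ["julia", "--project=.", "app.jl"]
```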


Unfortunately, it will not be possible to precompile everything in Dash, because in callbacks the types of the arguments depend on the JSON that comes from the front end. Perhaps this will improve with the transition to JSON3, which is more stable in terms of parsing result types.

No sorry, I mistyped. I meant that a lot should be precompiled, but maybe that only matters for startup time and not memory usage anyway.

No, you are absolutely right: JIT compilation requires a significant amount of memory.

julia> using DataFrames

julia> @time a = DataFrame(a = randn(1000), b = randn(1000));
  0.096489 seconds (228.65 k allocations: 13.744 MiB, 99.53% compilation time)

julia> @time a = DataFrame(a = randn(1000), b = randn(1000));
  0.000044 seconds (30 allocations: 33.656 KiB)

Thank you everyone for the kind answers! I asked two questions and got answers to both. Dash has great potential for web analytics apps in Julia, so I hope the performance will improve in the long term.