To evaluate Julia for the server side, I ran a few tests. I hope you enjoy reading, and please do comment.
Thank you for the test. This is related, and maybe some readers will be interested:
As I see it, Julia is missing from the popular https://www.techempower.com/benchmarks/ results; see the current list of languages.
And your test is similar to TechEmpower's "Test type 6: Plaintext":

1. The recommended URI is /plaintext.
2. The response content type must be set to text/plain.
3. The response body must be Hello, World!.
…
11. The request handler will be exercised at 256, 1,024, 4,096, and 16,384 concurrency.
So maybe it is a good candidate for FrameworkBenchmarks.
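For reference, a handler meeting those plaintext requirements is only a few lines of HTTP.jl. This is a sketch, not the benchmark code from the post; constructor and keyword details vary somewhat across HTTP.jl versions:

```julia
using HTTP

# TechEmpower-style plaintext handler: fixed body, text/plain content type.
plaintext(req::HTTP.Request) =
    HTTP.Response(200, ["Content-Type" => "text/plain"], "Hello, World!")

# Serve on all interfaces, port 8080 (blocking); only when run as a script.
if abspath(PROGRAM_FILE) == @__FILE__
    HTTP.serve(plaintext, "0.0.0.0", 8080)
end
```

The handler itself is a pure function of the request, so it can be unit-tested without starting the server.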
Does anybody know whether HTTP.jl’s server is multithreaded already, or whether the numbers are just for a single core?
And here I am making an online climate model with a Julia back end, and worrying about what would happen if I got 5 users at once. I’m such a noob.
Of course, just sending “hello world” isn’t doing much, so 5,000 concurrent requests of that isn’t that impressive. I’ve implemented a simple recommender system and load tested it, and it could also handle a high load (I can’t remember the exact numbers) while actually doing some real work (mostly indexing big matrices and sorting the resulting column).
I see you do an explicit @async, but I’m not sure that is actually needed here for async processing, since it’s all IO anyway and that is (I think) handled via non-blocking IO by libuv. My web server (via HttpServer.jl) relied on just that and scaled just fine.
Nice analysis! It would be great to try to figure out what’s causing the performance degradation and improve it. Getting to 6.8k connections without degradation is a good start.
@quinnj seems pretty close to figuring that out
It’s not multi-threaded, nor does it support any kind of multi-threading at the moment, due to Julia’s limited multi-threading abilities. It does, however, support multi-core serving, so multiple julia processes can all listen on the same port and process requests simultaneously (via the
reuseaddr=true keyword argument, currently limited to unix systems).
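In that multi-core model, each worker is simply a separate Julia process running the same server script on the same port. A sketch of one such process, assuming the reuseaddr keyword described above (unix only; the handler and port are arbitrary placeholders):

```julia
using HTTP

# One process per core: launch this same script N times.
# All N processes bind port 8080; with reuseaddr the kernel
# distributes incoming connections among them.
HTTP.serve("0.0.0.0", 8080; reuseaddr = true) do req
    HTTP.Response(200, "Hello, World!")
end
```

No coordination between the processes is needed; the OS does the load balancing at the socket level.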
An exciting PR to watch is https://github.com/JuliaLang/julia/pull/22631, which will allow Julia’s native
Tasks to run concurrently through the use of threads. It’s not entirely clear how the resulting interaction with IO (via libuv’s event loop) will play out, but hopefully this would automatically give better concurrency, since HTTP.jl already handles each incoming request in a separate task.
So if we start 4 services on 4 cores with reuseaddr=true (or a load balancer), then at least in theory we get 6.8K * 4 = 27.2K? Sounds good to me.
True, serving a static page doesn’t do much. The intention is to test Julia as a forwarding web server and compare it to popular alternatives out there, so that anyone who wants a web front end to something already written in Julia need not bother with a non-Julia server. Well, at least until it hits 5k concurrent connections.
Thanks for the link!
I have added it to my todo list.
The current Julia model multiplexes m tasks on 1 thread (an m x 1 model). I have my hopes pinned on 22631 bringing in (apart from the parfor interface) an m x n model.
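The m x 1 model is easy to see in plain Julia: tasks started with @async all share one OS thread, but they overlap whenever they yield (on IO, sleep, etc.). A toy sketch:

```julia
# m tasks multiplexed on one thread: both tasks overlap their sleeps
# cooperatively, so the pair finishes in roughly 0.1 s, not 0.2 s,
# even though only a single thread ever runs them.
t1 = @async (sleep(0.1); "task 1 done")
t2 = @async (sleep(0.1); "task 2 done")
results = (fetch(t1), fetch(t2))
```

This is why an IO-bound server can juggle many connections on one core, while CPU-bound work still serializes until an m x n (tasks on threads) model lands.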
The @async is simply a way to get the Julia prompt back. I think HTTP.server.serve() internally spawns a task for each connection, and if I’m not wrong, HttpServer.jl does the same.
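In other words, the only difference is who keeps the prompt. A sketch of that usage at the REPL (handler and port are arbitrary for illustration):

```julia
using HTTP

# HTTP.serve blocks until interrupted; wrapping it in @async hands the
# REPL prompt back immediately while the server keeps accepting
# connections, each of which is handled in its own task internally.
server = @async HTTP.serve("127.0.0.1", 8081) do req
    HTTP.Response(200, "Hello, World!")
end
```

The returned task can later be inspected or waited on with istaskdone(server) or wait(server).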