I’m trying to port a trivial Monte Carlo simulation from C++ and Rust to Julia. Here are the three implementations:
C++: (clang OSX) https://pastebin.com/HWEUxa6W
Rust: https://pastebin.com/7d31ynJE
Julia: https://pastebin.com/NCxnhWt1
Commands:
hyperfine --show-output -M 5 'julia --optimize=3 -O3 trnbias.jl 0 1000 0.01 1000'
(I assume this was already JITted, since I’ve run it a hundred times.)
Time (mean ± σ): 116.358 s ± 22.229 s [User: 110.601 s, System: 5.023 s]
Range (min … max): 91.366 s … 150.880 s 5 runs
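For an apples-to-apples comparison with the unchecked C++ build, Julia’s bounds checks can be disabled process-wide with `--check-bounds=no` (a sketch of the invocation; the script name and arguments are taken from the command above):

```shell
# Re-run the Julia benchmark with bounds checking disabled globally,
# matching the C++ build, which has no such checks.
hyperfine --show-output -M 5 \
  'julia -O3 --check-bounds=no trnbias.jl 0 1000 0.01 1000'
```

If the gap shrinks noticeably, bounds checks account for part of the difference; if not, the allocations are the more likely culprit.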
hyperfine --show-output -M 5 'cargo run --release -q --bin trnbias -- 0 1000 0.01 1000'
(Pre-built, so the build step that cargo run performs is essentially a no-op.)
Time (mean ± σ): 70.857 s ± 10.696 s [User: 69.616 s, System: 0.379 s]
Range (min … max): 56.655 s … 83.764 s 5 runs
clang++ -Wall -O3 -g -lcurses -std=c++11 TrnBias.CPP -o bin/TRNBIAS
hyperfine --show-output -M 5 'bin/TRNBIAS 0 1000 0.01 1000'
Program output: Mean IS=0.0453 OOS=-0.0004 Bias=0.0457
Time (mean ± σ): 43.332 s ± 0.633 s [User: 42.676 s, System: 0.237 s]
Range (min … max): 42.837 s … 44.376 s 5 runs
Both Julia and Rust have bounds checking and overflow/underflow checks; C++ does not. I’d expect Rust and Julia to be roughly on par on this test, so what am I missing? Also, there are quite a few GC triggers and allocations in the Julia run. Does that all come from the print statements? That seems like a lot, and I can’t see what else might allocate. (All timings are with hyperfine.)
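On the allocation question: prints do allocate, but in Julia the usual hidden sources in numeric code are array slices (which copy), untyped globals, and temporaries from vectorized expressions. A minimal sketch (hypothetical functions, not from the pastebin code) showing how `@allocated` exposes a copying slice and how `@view` avoids it:

```julia
# Hypothetical example: x[i:i+9] copies ten elements into a fresh Vector
# on every iteration, so this loop allocates and triggers GC.
function windowed_sum_copy(x::Vector{Float64})
    s = 0.0
    for i in 1:length(x)-9
        s += sum(x[i:i+9])        # allocates a new 10-element Vector each time
    end
    return s
end

# Same computation; @view reads the window without copying, and
# @inbounds drops the bounds checks inside the loop.
function windowed_sum_view(x::Vector{Float64})
    s = 0.0
    @inbounds for i in 1:length(x)-9
        s += sum(@view x[i:i+9])
    end
    return s
end

x = rand(10_000)
windowed_sum_copy(x); windowed_sum_view(x)   # warm up so compilation isn't counted
println(@allocated windowed_sum_copy(x))     # large: one allocation per iteration
println(@allocated windowed_sum_view(x))     # near zero
```

Running your real inner loop under `@time` (after a warm-up call) reports both allocation counts and GC time, which should pinpoint whether the prints or something inside the loop is responsible.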