Lookup in Dict{Int,Float64} allocates

I was surprised that the following code reports an allocation when looking up a Float64 value from a Dict{Int64,Float64}:

using BenchmarkTools

n = 1_000_000
d = Dict(i=>rand() for i in 1:n);
@btime d[42]
# 19.346 ns (1 allocation: 16 bytes)

Even more strangely, if I make a Dict{Int64,String} instead (as was done here), I get 0 allocations:

using BenchmarkTools
using Random

n = 1_000_000
d = Dict(i=>randstring() for i in 1:n);
@btime d[42]
# 19.346 ns (0 allocations: 0 bytes)

It seems bizarre that the allocations would depend on the value type of the Dict!

This is not actually a performance issue in my code, because I only have a small number of Dict lookups. I just stumbled across these allocations while profiling and was curious why they happen at all. Does anyone know why?

Don’t benchmark in global scope like that. See some of the top tips in Performance Tips · The Julia Language.

To elaborate on what Kristoffer said, you can either interpolate d from global scope:

julia> @btime $(d)[42]
  4.651 ns (0 allocations: 0 bytes)
0.852560373628357

or create a new local d using BenchmarkTools' setup keyword:

julia> @btime d[42] setup=(d = Dict(i=>rand() for i in 1:1_000_000))
  4.720 ns (0 allocations: 0 bytes)
0.20694337049842937
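As for why the value type mattered in the original global-scope benchmarks: a lookup through a non-const global is type-unstable (the compiler infers the result as Any), so an isbits result like Float64 has to be boxed — that is the 16-byte allocation — whereas a String is already a heap object and needs no extra box. Declaring the global const also removes the allocation, since the Dict's type is then known. A minimal sketch (names are illustrative, not from the thread):

```julia
d_f = Dict(1 => 1.0)        # non-const global: type unknown to the compiler
d_s = Dict(1 => "x")        # non-const global with String values
const d_c = Dict(1 => 1.0)  # const global: type known, lookup is type-stable

get_f() = d_f[1]  # inferred as Any; the Float64 result gets boxed
get_s() = d_s[1]  # inferred as Any, but a String is already heap-allocated
get_c() = d_c[1]  # inferred as Float64; no boxing

get_f(); get_s(); get_c()   # compile first so @allocated measures only the call
@assert (@allocated get_f()) > 0   # boxing allocates
@assert (@allocated get_s()) == 0  # no box needed for a heap object
@assert (@allocated get_c()) == 0  # type-stable, unboxed return
```

This is the same fix as interpolating with $ or using setup: all three give the compiler a concrete type to work with.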