Cannot understand the results from sizeof and @allocated

I think this still misses some Array metadata:

julia> Base.summarysize(Vector{Any}(undef, 0))
40

julia> Base.summarysize(Vector{Any}(undef, 2)) # +16 bytes = 2 64-bit pointers
56

julia> Base.summarysize(Any[1, 1.0]) # +16 bytes = 2 8-byte numbers
72
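
For contrast, sizeof only counts the element storage, not the Array object itself, so (on the same setup; exact numbers vary across Julia versions) the difference recovers that base metadata:

julia> sizeof(Vector{Any}(undef, 2)) # element storage only: 2 64-bit pointers
16

julia> Base.summarysize(Vector{Any}(undef, 2)) - sizeof(Vector{Any}(undef, 2)) # leftover Array metadata
40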

But there should also be some indication of each element’s type in order to properly interpret the 8-byte values. It might be flagged in the otherwise unused bits of the pointer’s virtual address, but that seems strange for an element type with an unlimited number of subtypes. I think boxed type information generally isn’t counted:

julia> Base.summarysize(Ref{Any}(1)) # 8 byte pointer, 8 byte value
16

julia> Base.summarysize(Core.Box(1)) # internal analog
16
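
For comparison, a concretely typed Ref stores its value inline, so there’s no pointer and no box to charge; this is just to illustrate the contrast, numbers from my setup:

julia> Base.summarysize(Ref{Int}(1)) # value stored inline, no box, no type tag counted
8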

Base.summarysize also intentionally doesn’t count some things:

julia> Base.summarysize(Int)
124

julia> Base.summarysize(DataType[Int,Int]) # base 40 + 2 8-byte pointers
56

I took this example from here, where it’s argued that it wouldn’t be accurate to add the size of the Int type for each element, but oddly the vector’s report doesn’t count the one Int type at all.
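
If I’m reading it right, this is because Base.summarysize takes an exclude keyword argument that defaults to skipping DataType (along with Core.TypeName and Core.MethodInstance). Something like this should charge the shared Int type object once, since unique objects are only counted once (untested sketch, I haven’t checked the exact number it returns):

julia> Base.summarysize(DataType[Int, Int]; exclude=Union{}) # exclude nothing: the one Int type should now be counted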

It’s also worth pointing out that @allocated doesn’t straightforwardly measure expected memory usage. In every example in this comment, replacing Base.summarysize with @allocated reports 0. I’m not sure why; I presume the compiler figured out that the values were entirely discarded and optimized the allocations away, which wasn’t possible for a+b.
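
One way I’d expect to get a nonzero reading is to make the result escape so the allocation can’t be elided, something like this (a sketch; the byte count will depend on the Julia version, and a first run may also include compilation overhead):

julia> const sink = Ref{Any}(); # hypothetical global to keep the result alive

julia> @allocated (sink[] = Vector{Any}(undef, 2)) # should now report a nonzero byte count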
