# Different inner product if vector type is not set

Hi All,
I would like to ask about the precision of `Float64`s in a vector of type `Any` vs. a vector of type `Float64`. There seems to be a subtle difference, which I noticed as part of a recent project. Here is a minimal example:

```julia
using Distributions, Random

g1 = Float64[]
g2 = Float64[]
g3 = Float64[]
seeds = 1:10000
for seed in seeds
    x = exp.(randn(101))

    T = length(x) - 1
    y = diff(log.(x))

    # same differences collected into a Vector{Any} ...
    z = Vector(undef, T)
    for i in 2:T+1
        z[i-1] = log(x[i]) - log(x[i-1])
    end

    # ... and into a Vector{Float64}
    a = Vector{Float64}(undef, T)
    for i in 2:T+1
        a[i-1] = log(x[i]) - log(x[i-1])
    end

    @assert all(z .== y)
    @assert all(a .== y)

    Random.seed!(seed)
    r1 = rand(InverseGamma(0.5 * T, 0.5 * y'*y))
    Random.seed!(seed)
    r2 = rand(InverseGamma(0.5 * T, 0.5 * z'*z))
    Random.seed!(seed)
    r3 = rand(InverseGamma(0.5 * T, 0.5 * a'*a))

    push!(g1, r1)
    push!(g2, r2)
    push!(g3, r3)
end

all(g1 .== g2)
all(g1 .== g3)
```
```
julia> all(g1 .== g2)
false

julia> all(g1 .== g3)
true
```

So it seems `Float64` computations give slightly different results when the values are stored in an `Any` array. Is this documented anywhere?

I can confirm this on master. A simpler example:

```
julia> x = rand(10000); y = Any[s for s in x];

julia> x'x == y'y
false
```

IIUC, the first one calls BLAS, while the second one is pure Julia. I have no idea whether the difference is to be expected, though. Maybe it is due to parallelization in BLAS, so the operations are performed in a different order?
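One quick way to check that the two inner products take different code paths (just a sketch; the exact method signatures that `@which` prints depend on the Julia version):

```julia
using LinearAlgebra

x = rand(10000)
y = Any[s for s in x]

# For a Float64 vector, x' * x dispatches to a specialized adjoint-vector method
# that ends up in dot/BLAS; for element type Any it hits a generic fallback that
# reduces the products one by one.
@which x' * x
@which y' * y
```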


Btw, the title is wrong: this is not about random draws.

Thank you. I corrected it.


Following @abraunst's example, I investigated this further. Since the inner product is just a sum of squares in this case, the order of computation should not matter (mathematically).

Trying different methods of calculating it, I found one that, applied to an `Any` vector holding `Float64`s, reproduces the result obtained with a `Float64` vector. The different methods also show slight differences among themselves.

```julia
using Random
Random.seed!(123);
x = rand(1000);

xval  = x'x            # specialized inner product for a Float64 vector
xval2 = sum(x.^2)      # square elementwise, then sum
xval3 = 0.0            # manual left-to-right accumulation
for val in x
    global xval3
    xval3 += val^2
end
```
```
julia> xval == xval2
true

julia> xval == xval3
false

julia> xval - xval3
-4.547473508864641e-13
```
```julia
y = Any[s for s in x];

yval  = y'y            # generic inner product for eltype Any
yval2 = sum(y.^2)
yval3 = 0              # manual left-to-right accumulation
for val in y
    global yval3
    yval3 += val^2
end
```
```
julia> yval == yval2
false

julia> yval - yval2
4.547473508864641e-13

julia> yval == yval3
true

julia> yval == xval
false

julia> yval - xval
4.547473508864641e-13

julia> yval2 == xval
true

julia> yval == xval3
true
```

This is normal and to be expected, as addition of floating-point numbers is not associative.


I donβt think I understand the meaning of associativity in this case.
E.g.
(2.5 +1.2) + 3.1 = 2.5 + (1.2 + 3.1)

I think you do, but you may be surprised to know that floating point addition is not associative (due to, essentially, rounding).

```
julia> @show (-1e30+1e30)+1.0 -1e30+(1e30+1.0);
(-1.0e30 + 1.0e30) + 1.0 = 1.0
-1.0e30 + (1.0e30 + 1.0) = 0.0
```
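
To tie this back to the inner products above, here is a sketch of the mechanism (not the exact code paths Julia uses): a reduction over a `Float64` array, such as `sum`, is free to regroup the terms (SIMD partial sums, pairwise blocking for long arrays), whereas the manual loops above, like the generic path taken for `Any` elements, add strictly left to right. Same terms, different grouping, hence different rounding:

```julia
x = rand(10000)
sq = x .^ 2

s_regrouped = sum(sq)       # reduction that may regroup terms
s_strict    = foldl(+, sq)  # strict left-to-right accumulation

s_regrouped == s_strict     # typically false for vectors of this length
s_regrouped - s_strict      # tiny difference, same flavor as the ones above
```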

I did not know that. Thank you!
