We ran into an issue with a matrix-vector (here: row-vector times column-vector) product. In the example below, the result should be zero:
a = [1.3190624946305824, 1.3190624946305824]
b = [-3.74 3.74]
# Matrix-Vector-Product
b * a # not always zero
sum(b .* a) # zero
On Machine 1, only the second calculation returns zero:
julia> b * a
1-element Vector{Float64}:
-3.583133801120832e-16
julia> sum(b .* a)
0.0
On Machine 2 both are zero:
julia> b * a
1-element Vector{Float64}:
0.0
julia> sum(b .* a)
0.0
However, the same calculation in MATLAB returns zero on Machine 1 as well. Since MATLAB uses MKL, we suspected the BLAS library, but switching to MKL.jl does not change the results.
Is the difference just the expected inaccuracy of floating-point arithmetic?
Machine 1:
julia> versioninfo()
Julia Version 1.7.2
Commit bf53498635 (2022-02-06 15:21 UTC)
Platform Info:
OS: Windows (x86_64-w64-mingw32)
CPU: Intel(R) Core(TM) i7-8565U CPU @ 1.80GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-12.0.1 (ORCJIT, skylake)
julia> BLAS.get_config()
LinearAlgebra.BLAS.LBTConfig
Libraries:
└ [ILP64] libopenblas64_.dll
Machine 2:
julia> versioninfo()
Julia Version 1.7.2
Commit bf53498635 (2022-02-06 15:21 UTC)
Platform Info:
OS: Windows (x86_64-w64-mingw32)
CPU: 11th Gen Intel(R) Core(TM) i7-1165G7 @ 2.80GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-12.0.1 (ORCJIT, tigerlake)
julia> BLAS.get_config()
LinearAlgebra.BLAS.LBTConfig
Libraries:
└ [ILP64] libopenblas64_.dll