Does anyone know how tolerances are computed for vectors by `isapprox` (`≈`) in the `Test` standard library? I'm seeing some behavior that I can't fully explain; see below.
```julia
using Test

a = rand(10) * 1e-4
@test a ≈ zeros(size(a)) atol=1e-4
```
```
Test Failed at REPL:1
  Expression: ≈(a, zeros(size(a)), atol=0.0001)
   Evaluated: [7.34438e-5, 1.48842e-5, 1.86573e-5, 3.26998e-5, 6.13959e-5, 4.73754e-5, 3.91773e-5, 6.14291e-5, 7.24425e-6, 4.73723e-5] ≈ [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] (atol=0.0001)
```
Every element is below `1e-4`, so to me it seems like that test should have passed. Any ideas?
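For what it's worth, comparing the elements one at a time does pass, so the vector form of `≈` apparently aggregates the comparison somehow (over the norm of the difference rather than elementwise, maybe?). A minimal sketch of the elementwise check that succeeds:

```julia
using Test

a = rand(10) * 1e-4

# Each element individually is within atol of zero, and this passes...
@test all(x -> isapprox(x, 0.0; atol=1e-4), a)

# ...while the whole-vector comparison with the same atol fails:
#   @test a ≈ zeros(size(a)) atol=1e-4
```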