So I have just installed Julia 1.10.2 on a Mac (M1) and I tried running

A = [1 2 3;4 5 6;7 8 9]
inv(A)

and I expected a singularity error. However, I actually get a matrix as output. What is weirder is that if I do the same on a Windows machine or a Linux machine running Arch, I do indeed get an error. Is this something to do with the Mac, or is there a problem with how I am running my code? For reference, I installed Julia 1.6.4 on the Mac and the same code does in fact throw an error.

Determining if a matrix is singular or not is an inherently unstable computation. Different orders of operations or cut-offs on the condition number will give different results.

julia> using AppleAccelerate
julia> A = [1 2 3; 4 5 6; 7 8 9]
3×3 Matrix{Int64}:
1 2 3
4 5 6
7 8 9
julia> using LinearAlgebra
julia> cond(A) # condition number
5.015946803014265e16
julia> inv(A) # an unstable approximation to the inverse
3×3 Matrix{Float64}:
3.15252e15 -6.30504e15 3.15252e15
-6.30504e15 1.26101e16 -6.30504e15
3.15252e15 -6.30504e15 3.15252e15
julia> versioninfo()
Julia Version 1.10.2
Commit bd47eca2c8a (2024-03-01 10:14 UTC)
Build Info:
Official https://julialang.org/ release
Platform Info:
OS: macOS (arm64-apple-darwin22.4.0)
CPU: 8 × Apple M1 Pro
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-15.0.7 (ORCJIT, apple-m1)
Threads: 6 default, 0 interactive, 3 GC (on 6 virtual cores)
Environment:
JULIA_EDITOR = code

This matrix is singular in theory but not computationally singular in the tolerances used here.
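A singularity test that does not depend on whether a particular BLAS happens to produce an exact zero pivot is to look at the numerical rank; this is a common approach, not something from the thread, and it works because rank is computed from the singular values with a relative tolerance:

```julia
using LinearAlgebra

A = [1 2 3; 4 5 6; 7 8 9]

# rank() compares the singular values against a relative tolerance,
# so it gives the same verdict regardless of BLAS pivoting behavior
r = rank(A)                   # 2: one singular value is below tolerance
is_singular = r < size(A, 1)  # true: treat A as numerically singular
```

You can also pass an explicit tolerance, e.g. `rank(A; rtol = 1e-10)`, if the default cut-off is not what you want.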

I checked eps() on all devices and they yield the same value. Wouldn't I expect a different eps() across devices if the tolerance were different? Or how could I change the default tolerance for the inv function?

I believe that eps() is based only on the floating-point representation; essentially all architectures use the IEEE floating-point standard, so they return the same value. The differences can show up in the BLAS implementations, because floating-point arithmetic is not associative and the order of operations can affect the result.
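To illustrate: eps is fixed by the Float64 format itself (IEEE 754 binary64), so the value below is identical on any conforming machine, whatever the CPU or BLAS:

```julia
# eps(Float64) is the spacing between 1.0 and the next Float64,
# determined by the IEEE 754 binary64 format, not by the hardware
eps()                      # 2.220446049250313e-16
eps(Float64) == 2.0^-52    # true
```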

It is not so much the inv operation as the LU factorization that determines whether the matrix is deemed singular.
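One way to see this directly (a sketch, not from the thread) is to run the LU factorization yourself with check = false, which suppresses the SingularException and lets you inspect the result:

```julia
using LinearAlgebra

A = [1 2 3; 4 5 6; 7 8 9]

# inv(A) computes an LU factorization first; SingularException is thrown
# only if a pivot comes out exactly zero, which is machine/BLAS-dependent
F = lu(A; check = false)
issuccess(F)    # false on machines where a pivot is exactly 0.0

# either way, the factors still reproduce the (row-permuted) matrix
F.L * F.U ≈ A[F.p, :]   # true
```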

The TLDR is that whether inv succeeds or fails depends on the implementation of the LU factorization which depends on BLAS library and the vector width of your CPU (and possibly other things).
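You can check which BLAS backend your session is actually using via the standard LinearAlgebra API (available since Julia 1.7); loading AppleAccelerate, as in the transcript above, swaps the backend:

```julia
using LinearAlgebra

# Lists the libraries currently loaded through libblastrampoline,
# e.g. OpenBLAS by default, or Accelerate after `using AppleAccelerate`
BLAS.get_config()
```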

Yes, but why does this differ across machines that are all running the same version of Julia? I would think this should be consistent across machines, right?

As @Oscar_Smith said, the result of a floating point calculation can vary from machine to machine depending on the order in which operations are performed. A likely culprit here is the vector width of the CPU floating point registers. I think the Apple M processors' registers are 128 bits whereas many Intel processors are 256 bits. If the LU factorization takes advantage of the full width of the registers then different processors will evaluate the expressions in the factorization in a different order with potentially different results. Generally the differences would be negligible but for singularity it is a matter of "really very, very small in magnitude" versus zero, as @stevengj implied.
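The non-associativity itself is easy to demonstrate; the values below are chosen so the rounding is unambiguous:

```julia
x = 2.0^53          # above 2^53, Float64 cannot represent every integer

# same three terms, different grouping, different result:
(x + 1.0) - x   # 0.0 — the + 1.0 is rounded away before the subtraction
(x - x) + 1.0   # 1.0
```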