Why hasn't LinearAlgebra been removed from the default sysimage?

`\`, `/`, and `inv` call `factorize`. The special matrix functions mostly call either `schur` or `eigen`, or they use `exp`, which calls `gebal` and `gesv`. So basically all of them rely on LAPACK.
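As a quick illustration of that chain (a sketch; which factorization `factorize` picks depends on the matrix's structure):

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 3.0]
b = [1.0, 2.0]

# `\` on a dense matrix goes through `factorize`, which inspects the
# structure and picks an LAPACK-backed factorization (Cholesky here,
# since A is symmetric positive definite):
F = factorize(A)
@show typeof(F)

# Solving with the factorization matches the direct solve:
@show F \ b ≈ A \ b

# `inv` likewise runs through an LAPACK factorization under the hood:
@show inv(A) * A ≈ I
```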

To see many of the relevant method definitions, take a look at `src/dense.jl` and `src/symmetric.jl` in the JuliaLang/LinearAlgebra.jl repository on GitHub.

2 Likes

Yes, that’s what I’m saying. They rely on BLAS/LAPACK but they are not BLAS/LAPACK. If the shim covers BLAS/LAPACK the generics are fine.

1 Like

Exactly. My point was that it must cover a fairly substantial part of BLAS/LAPACK, not just matmul. I was implicitly responding to this line:

1 Like

Dead simple doesn’t mean just matmul. It just means the simple textbook implementations for LU, QR, etc.
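For concreteness, here's roughly what such a textbook implementation looks like for LU with partial pivoting (an illustrative sketch, not the actual proposed shim; `naive_lu!` is a made-up name):

```julia
# Textbook LU factorization with partial pivoting, overwriting A with the
# packed factors: the unit lower triangle of L and the upper triangle U.
function naive_lu!(A::Matrix{Float64})
    n = size(A, 1)
    p = collect(1:n)                     # row permutation vector
    for k in 1:n-1
        # Partial pivoting: pick the largest-magnitude entry in column k.
        m = k - 1 + argmax(abs.(@view A[k:n, k]))
        if m != k
            A[k, :], A[m, :] = A[m, :], A[k, :]
            p[k], p[m] = p[m], p[k]
        end
        # Eliminate below the pivot.
        for i in k+1:n
            A[i, k] /= A[k, k]
            for j in k+1:n
                A[i, j] -= A[i, k] * A[k, j]
            end
        end
    end
    return A, p
end
```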

3 Likes

I think I like that, but it would mean every library that uses some kind of BLAS-related operation would have to test and benchmark on each platform and backend; other than that it is fine. Would we always let the user use their favourite backend, or should the library make the choice, and is it OK to only work on some of them?

1 Like

No, changing BLAS shouldn’t change results unless the problem is very ill-conditioned, in which case the result is essentially random in the first place.

I don’t understand this question. It doesn’t make sense in the context of BLAS/LAPACK. All are effectively equivalent in results.

Ah yes, true. Is it the same for performance? If a change makes code twice as fast with backend X, should it do the same on backend Y? In that case there would be no need to test for regressions on each of them.

To be a bit more accurate, OpenBLAS is a separate non-Julia library; it is not “in the sysimage”. Even if LinearAlgebra is not in the sysimage, it would still be a stdlib, so we would still ship the OpenBLAS library. What could perhaps be said is that we don’t want to load the OpenBLAS library when Julia starts. The `Libdl.LazyLibrary` pull request (JuliaLang/julia#50074) could help with that, regardless of whether LinearAlgebra is in the sysimage or not.

There isn’t really any known plan for what to do with the pirating stdlibs. I don’t know of a good solution right now (that is non-breaking). People are very quick to come up with ideas that have either been discussed before or are not workable for various reasons.

2 Likes

I second this option.

AbstractArray is a general data structure with support for N dimensions; it describes a memory layout.

LinearAlgebra could define its own subtypes for vectors and matrices in a linear space.

Functions defined in Base would be in the scope of general arrays. Functions defined in LinearAlgebra would be mathematical in nature.

If users expect A*B to make sense upon starting Julia, we could recommend the use of appropriate subtypes from LinearAlgebra as opposed to general array types.

Any other solution will lead to long-term maintenance challenges and suboptimal workarounds. The current encapsulation of concepts is not ideal.

For the transition, a clear error message would do the job:

Error: A*B is not defined for general arrays. If you need linear algebra, wrap the arrays in LinearAlgebra.Matrix or LinearAlgebra.Vector
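For what it's worth, something like that transition message could be prototyped today with Base's (experimental) error-hint mechanism. A sketch, assuming the hypothetical `LinearAlgebra.Matrix`/`LinearAlgebra.Vector` types from above:

```julia
# Attach a hint to MethodErrors from `*` on general arrays. Note that the
# suggested LinearAlgebra.Matrix / LinearAlgebra.Vector types are
# hypothetical; they don't exist in LinearAlgebra today.
Base.Experimental.register_error_hint(MethodError) do io, exc, argtypes, kwargs
    if exc.f === (*) && length(argtypes) == 2 &&
       all(T -> T <: AbstractArray, argtypes)
        print(io, "\nA*B is not defined for general arrays. If you need ",
                  "linear algebra, wrap the arrays in LinearAlgebra.Matrix ",
                  "or LinearAlgebra.Vector.")
    end
end
```

The hint is appended whenever `showerror` displays a matching MethodError, so it would only fire in the hypothetical world where `*` on plain arrays has no method.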
2 Likes

I must say, with my more than limited knowledge of this topic, that as an end user doing a lot of maths with Julia, using LinearAlgebra is completely painless to me, while wrapping every array with LinearAlgebra.<...> would be extremely painful.

15 Likes

In most scientific domains I have worked in, wrapping arrays into specific matrix types is already a requirement. Most problems have structure leading to Symmetric, Hermitian, Diagonal, tridiagonal, or pentadiagonal matrices, with their corresponding Julia types. Specialized algorithms can only be dispatched properly after wrapping the arrays explicitly. It is a necessary pain, I would say.
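For example, wrapping is exactly what lets dispatch pick a specialized algorithm (a small sketch):

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 3.0]
S = Symmetric(A)            # expose the structure to the type system

# eigen(::Symmetric) dispatches to the specialized symmetric eigensolver
# (the LAPACK `syev` family), which guarantees real eigenvalues, instead
# of the general nonsymmetric solver used for a plain Matrix:
E = eigen(S)
@show E.values              # real, sorted in ascending order
```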

2 Likes

Consider this not as “wrapping” that you need to do every time when doing linear algebra operations, but simply as using the right types for the different purposes. Whenever you need a matrix, create a Matrix – it just doesn’t have to be the same as Array{2} as it is now.

4 Likes

I agree with you that specific typing is of great interest for computations, but it is not a requirement. It’s one of the things I, for one, like very much about Julia: you can precisely type things, but you are not obliged to.
And to be very clear, I use Symmetric and the others whenever I can; what I meant was that having to prepend a lot of things with LinearAlgebra would still be a pain. But I’m guessing this is a shared feeling anyway, and probably not what you meant with your proposal for an error message.

2 Likes

Changing BLAS can definitely change results by a ULP or two. I’d expect a naive triple for loop to be even worse in many cases. It’s like the difference between a linear sum and a pairwise one.

5 Likes

BLAS libraries don’t do pairwise summation or anything like it, AFAIK, so an $n \times n$ triple-loop matmul would have the same $O(\epsilon_{\text{machine}} \sqrt{n})$ roundoff error characteristics as the optimized `BLAS.dgemm` (though of course the errors would be slightly different, in the same way that they are slightly different if you switch BLAS libraries).

If your program has a problem with the roundoff errors changing from BLAS to naive matmul, it will also have a problem with the roundoff errors changing when you switch from x86 to ARM or switch from OpenBLAS to MKL.

In short, as long as the fallback “shim” is a numerically stable algorithm (e.g. triple-loop matmul, partial-pivoted LU, and column-pivoted Householder QR), even if it’s a naive implementation from a performance perspective, I wouldn’t worry about it from an accuracy perspective. (Fortunately, we don’t need eigensolvers or SVD in Base AFAIK, because implementing those accurately and robustly is a lot harder.)
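To make the comparison concrete, here is what such a triple-loop fallback looks like next to the BLAS-backed `*` (a sketch; `naive_matmul` is a made-up name):

```julia
# Naive triple-loop matmul: numerically stable, just slow. Its roundoff
# differs from a BLAS gemm only in ULP-level details.
function naive_matmul(A::Matrix{Float64}, B::Matrix{Float64})
    m, k = size(A)
    k2, n = size(B)
    @assert k == k2 "inner dimensions must match"
    C = zeros(m, n)
    for j in 1:n, p in 1:k, i in 1:m
        C[i, j] += A[i, p] * B[p, j]
    end
    return C
end

A, B = rand(100, 100), rand(100, 100)
# The two results agree to roundoff, just as two BLAS libraries would:
@show maximum(abs, naive_matmul(A, B) - A * B)
```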

6 Likes

We would, if we want to avoid breaking code that relies on sqrt and exp working on matrices.

julia> @eval LinearAlgebra eigen(::Hermitian{<:Real}) = error("We've decided this is no longer allowed");

julia> sqrt([1 2
             2 4])
ERROR: We've decided this is no longer allowed
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] eigen(::Hermitian{Int64, Matrix{Int64}})
   @ LinearAlgebra ./REPL[4]:1
 [3] sqrt(A::Hermitian{Int64, Matrix{Int64}}; rtol::Float64)
   @ LinearAlgebra ~/.julia/juliaup/julia-1.11.3+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/symmetric.jl:813
 [4] sqrt(A::Hermitian{Int64, Matrix{Int64}})
   @ LinearAlgebra ~/.julia/juliaup/julia-1.11.3+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/symmetric.jl:812
 [5] sqrt(A::Matrix{Int64})
   @ LinearAlgebra ~/.julia/juliaup/julia-1.11.3+0.x64.linux.gnu/share/julia/stdlib/v1.11/LinearAlgebra/src/dense.jl:909
 [6] top-level scope
   @ REPL[5]:1
4 Likes

I am not a huge fan of the idea that Julia would default to a bunch of crappy linalg functions until LinearAlgebra is loaded — either give a good function or none at all!

I am not one of those users who protests the need for using LinearAlgebra before A * v, nor who protests the need for using Statistics before mean(x), but I can’t imagine those users wouldn’t also protest the need for using LinearAlgebra before a non-naive A * v. It just seems like the worst of all worlds.

15 Likes

Fully agreed. IMO the right way to do this is to not break people’s code, and continue to let LinearAlgebra pirate Base methods if it’s loaded.

Then, we just provide a command-line flag or environment variable that allows people to opt out of loading LinearAlgebra at startup, making these functions hit method-errors or default fallbacks (if they exist).

2 Likes

Oh shoot, I forgot about those. That being said, I think it would be much less breaking if we required LinearAlgebra for those methods, which are fairly specialized (and likely most code that needs them is already using LinearAlgebra, I’d guess?).

This would be so breaking that it’s hard to imagine doing this in 1.x (at least, not by default … of course if it’s opt-out that’s a different story, albeit one that may raise many technical issues).

The problem with breaking stuff like this in 1.x is that maybe packages won’t be affected much, but I promise there’s a million scripts out there written by not-so-expert julia programmers that use stuff like trig functions or roots on matrices without loading LinearAlgebra.

These are people who are probably going to be extra confused / upset if their code breaks like this in a 1.x update.

I think it makes a lot more sense to continue to load LinearAlgebra by default, but then create a mechanism to allow users to opt out of it if they wish (e.g. because they need better startup times for some script).

Technically this is already doable with custom sysimages, but I doubt anyone is actually doing that. If we were able to make a commandline flag or environment variable to not load LinearAlgebra, it’d be a lot lower friction and more useful.

12 Likes