Stripping OpenBLAS from private build of Julia?

It’s well known that BLAS buffers are the largest source of RAM usage by Julia at startup. Would this problem go away if the stdlib LinearAlgebra were not loaded at startup? I understand that changing this publicly may be problematic for backward compatibility etc., but would it be easy to strip OpenBLAS out of a private build of Julia for internal consumption?

1 Like

Yeah, this is totally doable (as long as you don’t need matrix multiplication or other BLAS stuff). If you are doing this, you may also be able to remove libgfortran, which will save another ~10 MB.
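
If you want to check whether a given workload actually touches OpenBLAS before stripping it, one rough check (just a sketch, using Libdl from the standard library) is to look at which shared libraries the process has mapped:

```julia
using Libdl

# List the loaded shared libraries and keep any that look like OpenBLAS.
# On a stock build this typically shows libopenblas64_; in a stripped build it should be empty.
filter(lib -> occursin("openblas", lowercase(lib)), dllist())
```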

6 Likes

It should be easy, and it’s what I want long-term.

I’ve excised LinearAlgebra from the sysimage privately, to see if it could be done. I tested it, and e.g. matrix multiply did still work, since some of it is already in place even before you do `using LinearAlgebra`, and I think that part still used OpenBLAS. There’s also a generic matmul, and it alone could be used, so OpenBLAS at least need not be included, keeping full API compatibility, though not full speed.

You can see what I did here (dropping LinearAlgebra entirely is also ok if you don’t need it, since in this build the REPL and Pkg have also been excised):

As I said, matmul would still work even with OpenBLAS gone (the generic version isn’t used for the float types while OpenBLAS is still loaded, but when it isn’t loaded the generic one should be used). It should work without any changes, but I see there’s still some work to make it better:
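
To give an idea of what the generic fallback amounts to, here is a minimal pure-Julia three-loop matmul (my own sketch, not the actual LinearAlgebra implementation); this is the kind of code that can keep `*` working once OpenBLAS is gone, just not at full speed:

```julia
# Naive generic matrix multiply: no BLAS, works for any element type with + and *.
function naive_matmul(A::AbstractMatrix, B::AbstractMatrix)
    size(A, 2) == size(B, 1) || throw(DimensionMismatch("inner dimensions must match"))
    C = zeros(promote_type(eltype(A), eltype(B)), size(A, 1), size(B, 2))
    for j in axes(B, 2), k in axes(A, 2), i in axes(A, 1)
        C[i, j] += A[i, k] * B[k, j]
    end
    return C
end

A, B = rand(4, 4), rand(4, 4)
naive_matmul(A, B) ≈ A * B   # true, up to floating-point roundoff
```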

I and others would welcome OpenBLAS being gone. Many replace it at runtime anyway, e.g. with MKL.jl (proprietary, non-MIT). Another option, replacing only the BLAS part of OpenBLAS (not LAPACK), would be BLISBLAS.jl (MIT).
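
For reference, swapping the backend at runtime looks roughly like this (assuming MKL.jl is installed; BLISBLAS.jl is used the same way, via libblastrampoline):

```julia
using LinearAlgebra
BLAS.get_config()   # on a default build this lists libopenblas64_

using MKL           # loading MKL.jl points libblastrampoline at MKL instead
BLAS.get_config()   # now lists the MKL library
```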

You can already compile without some or all stdlibs, e.g. without LinearAlgebra:

By default, all standard libraries are included in the sysimage. It is possible to only include those standard libraries that the project needs. This is done by passing the keyword argument filter_stdlibs=true to create_app. This causes the sysimage to be smaller, and possibly load faster. The reason this is not the default is that it is possible to “accidentally” depend on a standard library without it being reflected in the Project file. For example, it is possible to call rand() from a package without depending on Random, even though that is where the method is defined. If Random was excluded from the sysimage, that call would then error. As another example, matrix multiplication, rand(3,3) * rand(3,3), requires both the standard libraries LinearAlgebra and Random. This is because these standard libraries practice “type-piracy”, so just loading them can cause code to change behavior.

Nevertheless, the option is there to use. Just make sure to properly test the app with the resulting sysimage.

It does something similar to what I was doing, but with the tools as they are, with no need to change Julia. But I think it only gets rid of LinearAlgebra, not OpenBLAS. You can delete its .so (and also LLVM’s), but I find it likely your app would then fail. That does NOT happen when you delete LLVM’s library; nothing breaks until you try to use it, even implicitly, and you can often do without it.
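
For completeness, a minimal sketch of the filter_stdlibs call described in the quoted docs (the package and output paths here are placeholders):

```julia
using PackageCompiler

# Build an app whose sysimage only includes the stdlibs the project declares.
create_app("MyApp", "MyAppCompiled"; filter_stdlibs = true)
```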

2 Likes

What’s the line in the code that actually allocates buffers for BLAS? If I delete the shared library file for libopenblas64, I get an error message at Julia startup, but there is no reduction in the startup memory usage. In the __init__() functions of LinearAlgebra.jl and OpenBLAS_jll.jl, I also don’t see any obvious allocation.
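
A rough way to measure this (just a sketch; Sys.maxrss() reports the process’s peak resident set size in bytes):

```julia
println("after startup:      ", round(Sys.maxrss() / 2^20; digits = 1), " MiB")

using LinearAlgebra
A = rand(500, 500)
A * A   # forces OpenBLAS to initialize its threads and buffers

println("after first matmul: ", round(Sys.maxrss() / 2^20; digits = 1), " MiB")
```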

It’s also possible the cause is not OpenBLAS? If I recall correctly, Julia always reserves about 256 MB for LLVM (actually much more, since it calculates it as some fraction of physical memory). I can dig in and find that line, but is this really a problem?

Julia does just allocate a lot of memory, but if it’s not used it’s only virtual memory, which doesn’t really cost you anything.
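
To see the difference in practice on Linux (just a quick, Linux-only sketch reading /proc):

```julia
# Compare the process's virtual size (VmSize) with what is actually resident (VmRSS).
for line in eachline("/proc/self/status")
    if startswith(line, "VmSize") || startswith(line, "VmRSS")
        println(line)
    end
end
```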

You CAN get rid of the LLVM .so for (some) compiled apps, though I think Julia might still allocate memory for it, assuming it will be used. I don’t think it’s a problem; that memory isn’t reserved exclusively for LLVM, so the rest of the code could use it.

[I seem to recall Julia allocating for OpenBLAS, but it may be deferred until it is actually used.]