sob
August 18, 2025, 11:28pm
1
LinearSolve, as a dependency of DiffEqGPU (version 3.8.0), fails to precompile on Julia 1.11.6 (Linux).
(Test) pkg> precompile
Precompiling project...
✗ LinearSolve → LinearSolveCUDAExt
33 dependencies successfully precompiled in 131 seconds. 583 already precompiled.
1 dependency had output during precompilation:
┌ Test
│ ERROR: LoadError: UndefVarError: `LinearVerbosity` not defined in `LinearSolveCUDAExt`
│ Stacktrace:
│ [1] top-level scope
│ @ ~/.julia/packages/LinearSolve/fHamo/ext/LinearSolveCUDAExt.jl:52
│ [2] include
│ @ ./Base.jl:562 [inlined]
│ [3] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::String)
│ @ Base ./loading.jl:2881
│ [4] top-level scope
│ @ stdin:6
│ in expression starting at /home/sob/.julia/packages/LinearSolve/fHamo/ext/LinearSolveCUDAExt.jl:1
│ in expression starting at stdin:6
│ ┌ Error: Error during loading of extension LinearSolveCUDAExt of LinearSolve, use `Base.retry_load_extensions()` to retry.
│ │ exception =
│ │ 1-element ExceptionStack:
│ │ Failed to precompile LinearSolveCUDAExt [e24d4dde-ed20-5ee7-b465-f1dd73f9b6ba] to "/home/sob/.julia/compiled/v1.11/LinearSolveCUDAExt/jl_7Ajqnz".
│ │ Stacktrace:
│ │ [1] error(s::String)
│ │ @ Base ./error.jl:35
│ │ [2] compilecache(pkg::Base.PkgId, path::String, internal_stderr::IO, internal_stdout::IO, keep_loaded_modules::Bool; flags::Cmd, cacheflags::Base.CacheFlags, reasons::Dict{String, Int64}, loadable_exts::Vector{Base.PkgId})
│ │ @ Base ./loading.jl:3174
│ │ [3] (::Base.var"#1110#1111"{Base.PkgId})()
│ │ @ Base ./loading.jl:2579
Anybody getting the same error?
1 Like
I’m getting the same error on two different machines.
Julia Version 1.11.6
Commit 9615af0f269 (2025-07-09 12:58 UTC)
Build Info:
Official https://julialang.org/ release
Platform Info:
OS: Linux (x86_64-linux-gnu)
CPU: 16 × AMD Ryzen 7 1700 Eight-Core Processor
WORD_SIZE: 64
LLVM: libLLVM-16.0.6 (ORCJIT, znver1)
Threads: 1 default, 0 interactive, 1 GC (on 16 virtual cores)
Julia Version 1.11.6
Commit 9615af0f269 (2025-07-09 12:58 UTC)
Build Info:
Official https://julialang.org/ release
Platform Info:
OS: Windows (x86_64-w64-mingw32)
CPU: 8 × 11th Gen Intel(R) Core(TM) i7-1165G7 @ 2.80GHz
WORD_SIZE: 64
LLVM: libLLVM-16.0.6 (ORCJIT, znver1)
Threads: 1 default, 0 interactive, 1 GC (on 8 virtual cores)
1 Like
Sorry, there was a patch that caused an issue with the CUDA extension. It should now be fixed in LinearSolve v3.34.0.
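For anyone hitting this before the resolver picks up the new release on its own, forcing the patched LinearSolve version from the Pkg REPL should clear the failing extension. A sketch, not official guidance (the project name in the prompt is just a placeholder):

```julia
# In the Pkg REPL (press ] at the julia> prompt):
#   (MyProject) pkg> add LinearSolve@3.34
#   (MyProject) pkg> precompile

# Or equivalently, programmatically:
import Pkg
Pkg.add(name = "LinearSolve", version = "3.34.0")  # pull in the patched release
Pkg.precompile()                                    # re-run precompilation, including extensions
```

If another package's compat bounds keep an older LinearSolve pinned, `Pkg.update("LinearSolve")` after relaxing those bounds may be needed instead.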
1 Like
sob
August 19, 2025, 9:21pm
4
Works now, thank you @ChrisRackauckas.
sob
August 19, 2025, 10:12pm
5
Still getting a warning that init_cacheval() is overwritten, though.
Can you share that warning?
sob
August 20, 2025, 3:12pm
7
In a new, empty project, after calling add CUDA DiffEqGPU (Linux, Julia 1.11.6):
Precompiling project...
? LinearSolve → LinearSolveCUDAExt
1 dependency successfully precompiled in 3 seconds. 227 already precompiled.
1 dependencies failed but may be precompilable after restarting julia
1 dependency had output during precompilation:
┌ LinearSolve → LinearSolveCUDAExt
│ WARNING: Method definition init_cacheval(LinearSolve.CudaOffloadLUFactorization, Any, Any, Any, Any, Any, Int64, Any, Any, LinearSolve.LinearVerbosity{T} where T, LinearSolve.OperatorAssumptions{T} where T) in module LinearSolve at /home/sob/.julia/packages/LinearSolve/507yZ/src/factorization.jl:1213 overwritten in module LinearSolveCUDAExt at /home/sob/.julia/packages/LinearSolve/507yZ/ext/LinearSolveCUDAExt.jl:55.
│ ERROR: Method overwriting is not permitted during Module precompilation. Use `__precompile__(false)` to opt-out of precompilation.
└
(Blah) pkg>
Alright, we got our GPU CI machines fixed. Try LinearSolve.jl v3.36.0. That should be good now.
sob
August 21, 2025, 12:16pm
9
All clear now, thanks @ChrisRackauckas.
1 Like