Conflict between Turing and CUDA/Flux?

I’m encountering an issue with CUDA.jl when adding Turing.jl: the latter seems to downgrade the former (along with several other packages):

(@v1.7) pkg> add Turing
   Resolving package versions...
    Updating `~/.julia/environments/v1.7/Project.toml`
  [052768ef] ↓ CUDA v3.4.1 ⇒ v2.6.3
  [31c24e10] ↓ Distributions v0.25.37 ⇒ v0.24.18
  [587475ba] ↓ Flux v0.12.8 ⇒ v0.12.1
  [916415d5] ↓ Images v0.25.0 ⇒ v0.24.1
  [929cbde3] ↓ LLVM v4.7.0 ⇒ v3.9.0
  [fce5fe82] + Turing v0.15.1
    Updating `~/.julia/environments/v1.7/Manifest.toml`
  [80f14c24] + AbstractMCMC v2.5.0
  [7a57a42e] + AbstractPPL v0.1.4
  [0bf59076] + AdvancedHMC v0.2.27
  [5b7e9947] + AdvancedMH v0.5.9
  [b5ca4192] + AdvancedVI v0.1.3
  [dce04be8] + ArgCheck v2.1.0
  [ec485272] - ArnoldiMethod v0.2.0
  [4fba245c] ↓ ArrayInterface v3.2.1 ⇒ v2.14.17
  [198e06fe] + BangBang v0.3.32
  [9718e550] + Baselet v0.1.1
  [6e4b80f9] + BenchmarkTools v0.7.0
  [76274a88] + Bijectors v0.8.13
  [b99e7846] + BinaryProvider v0.5.10
  [052768ef] ↓ CUDA v3.4.1 ⇒ v2.6.3
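
As an aside, one way to make the resolver report which compat bound forces the downgrade, rather than downgrading silently, is to request the held-back version explicitly alongside Turing (a sketch, using the versions shown above):

(@v1.7) pkg> add CUDA@3.4.1 Turing

If the bounds genuinely clash, this should fail with an “Unsatisfiable requirements detected” error naming the packages and compat ranges involved, instead of quietly picking older versions.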

After adding Turing.jl, precompilation throws the following error:

Precompiling project...
  ✗ CUDA
  ✗ Flux
  0 dependencies successfully precompiled in 8 seconds (209 already precompiled)

ERROR: The following 2 direct dependencies failed to precompile:

Flux [587475ba-b771-5e3f-ad9e-33799f191a9c]

Failed to precompile Flux [587475ba-b771-5e3f-ad9e-33799f191a9c] to /Users/patrickaltmeyer/.julia/compiled/v1.7/Flux/jl_yv9bLk.
ERROR: LoadError: InitError: could not load symbol "LLVMExtraInitializeAllTargets":
dlsym(RTLD_DEFAULT, LLVMExtraInitializeAllTargets): symbol not found
Stacktrace:
  [1] LLVMInitializeAllTargets
    @ ~/.julia/packages/LLVM/srSVa/lib/libLLVM_extra.jl:10 [inlined]
  [2] InitializeAllTargets
    @ ~/.julia/packages/LLVM/srSVa/src/init.jl:58 [inlined]
  [3] __init__()
    @ GPUCompiler ~/.julia/packages/GPUCompiler/XwWPj/src/GPUCompiler.jl:50
  [4] _include_from_serialized(path::String, depmods::Vector{Any})
    @ Base ./loading.jl:768
  [5] _require_search_from_serialized(pkg::Base.PkgId, sourcepath::String)
    @ Base ./loading.jl:854
  [6] _require(pkg::Base.PkgId)
    @ Base ./loading.jl:1097
  [7] require(uuidkey::Base.PkgId)
    @ Base ./loading.jl:1013
  [8] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:997
  [9] include
    @ ./Base.jl:418 [inlined]
 [10] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt64}}, source::String)
    @ Base ./loading.jl:1318
 [11] top-level scope
    @ none:1
 [12] eval
    @ ./boot.jl:373 [inlined]
 [13] eval(x::Expr)
    @ Base.MainInclude ./client.jl:453
 [14] top-level scope
    @ none:1
during initialization of module GPUCompiler
in expression starting at /Users/patrickaltmeyer/.julia/packages/CUDA/M4jkK/src/CUDA.jl:1
ERROR: LoadError: Failed to precompile CUDA [052768ef-5323-5732-b1bb-66c8b64840ba] to /Users/patrickaltmeyer/.julia/compiled/v1.7/CUDA/jl_CdfK3s.
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:33
  [2] compilecache(pkg::Base.PkgId, path::String, internal_stderr::IO, internal_stdout::IO, ignore_loaded_modules::Bool)
    @ Base ./loading.jl:1466
  [3] compilecache(pkg::Base.PkgId, path::String)
    @ Base ./loading.jl:1410
  [4] _require(pkg::Base.PkgId)
    @ Base ./loading.jl:1120
  [5] require(uuidkey::Base.PkgId)
    @ Base ./loading.jl:1013
  [6] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:997
  [7] include
    @ ./Base.jl:418 [inlined]
  [8] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt64}}, source::Nothing)
    @ Base ./loading.jl:1318
  [9] top-level scope
    @ none:1
 [10] eval
    @ ./boot.jl:373 [inlined]
 [11] eval(x::Expr)
    @ Base.MainInclude ./client.jl:453
 [12] top-level scope
    @ none:1
in expression starting at /Users/patrickaltmeyer/.julia/packages/Flux/qp1gc/src/Flux.jl:1

CUDA [052768ef-5323-5732-b1bb-66c8b64840ba]

Failed to precompile CUDA [052768ef-5323-5732-b1bb-66c8b64840ba] to /Users/patrickaltmeyer/.julia/compiled/v1.7/CUDA/jl_2dru7W.
ERROR: LoadError: InitError: could not load symbol "LLVMExtraInitializeAllTargets":
dlsym(RTLD_DEFAULT, LLVMExtraInitializeAllTargets): symbol not found
Stacktrace:
  [1] LLVMInitializeAllTargets
    @ ~/.julia/packages/LLVM/srSVa/lib/libLLVM_extra.jl:10 [inlined]
  [2] InitializeAllTargets
    @ ~/.julia/packages/LLVM/srSVa/src/init.jl:58 [inlined]
  [3] __init__()
    @ GPUCompiler ~/.julia/packages/GPUCompiler/XwWPj/src/GPUCompiler.jl:50
  [4] _include_from_serialized(path::String, depmods::Vector{Any})
    @ Base ./loading.jl:768
  [5] _require_search_from_serialized(pkg::Base.PkgId, sourcepath::String)
    @ Base ./loading.jl:854
  [6] _require(pkg::Base.PkgId)
    @ Base ./loading.jl:1097
  [7] require(uuidkey::Base.PkgId)
    @ Base ./loading.jl:1013
  [8] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:997
  [9] include
    @ ./Base.jl:418 [inlined]
 [10] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt64}}, source::Nothing)
    @ Base ./loading.jl:1318
 [11] top-level scope
    @ none:1
 [12] eval
    @ ./boot.jl:373 [inlined]
 [13] eval(x::Expr)
    @ Base.MainInclude ./client.jl:453
 [14] top-level scope
    @ none:1
during initialization of module GPUCompiler
in expression starting at /Users/patrickaltmeyer/.julia/packages/CUDA/M4jkK/src/CUDA.jl:1

A similar issue was mentioned here. Grateful for any advice!

Thanks

There is no conflict, other than that Turing doesn’t support Julia 1.7 yet.

To see this, resolve the same packages in a temporary environment on Julia 1.6:

(@v1.6) pkg>  activate --temp

(jl_SBpaANL) pkg> add CUDA Flux Turing
   Updating `...\Temp\jl_SBpANL\Project.toml`
[052768ef] + CUDA v3.5.0
[587475ba] + Flux v0.12.8
[fce5fe82] + Turing v0.19.2
(....)
  106 dependencies successfully precompiled in 156 seconds (39 already precompiled) 

Ah, my bad! Thanks for the quick response 🙂
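
For anyone who lands here with the same downgrade: removing Turing and re-resolving should restore the newer versions (a sketch, assuming the v1.7 environment from the original post):

(@v1.7) pkg> rm Turing

(@v1.7) pkg> up CUDA Flux

after which CUDA and Flux should precompile again on their latest compatible versions.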