XGBoost 'libxgboost' not defined error

Hi, I am new to this and unsure whether I should file this as an issue on GitHub, as I'm not highly skilled at this level of computing.
I am using Windows 11 Pro and XGBoost.jl (with MLJXGBoostInterface) and it was working okay with Julia 1.8 until I upgraded to 1.9.0.
I have spent days searching, reading articles, and trying to figure it out, including completely uninstalling Julia, removing all packages, reinstalling earlier versions I had used (Julia 1.8 through 1.9, XGBoost 2.2.5 and 2.3.0), not using MLJ, etc., with no joy.
Here’s my stacktrace, which TBH is not something I really understand. :frowning:

using XGBoost
ERROR: InitError: UndefVarError: libxgboost not defined
Stacktrace:
[1] XGBRegisterLogCallback
@ C:\Users\frank\.julia\packages\XGBoost\sa5Xe\src\Lib.jl:56 [inlined]
[2] init()
@ XGBoost C:\Users\frank\.julia\packages\XGBoost\sa5Xe\src\XGBoost.jl:41
[3] register_restored_modules(sv::Core.SimpleVector, pkg::Base.PkgId, path::String)
@ Base .\loading.jl:1115
[4] _include_from_serialized(pkg::Base.PkgId, path::String, ocachepath::String, depmods::Vector{Any})
@ Base .\loading.jl:1061
[5] _require_search_from_serialized(pkg::Base.PkgId, sourcepath::String, build_id::UInt128)
@ Base .\loading.jl:1506
[6] _require(pkg::Base.PkgId, env::String)
@ Base .\loading.jl:1783
[7] _require_prelocked(uuidkey::Base.PkgId, env::String)
@ Base .\loading.jl:1660
[8] macro expansion
@ .\loading.jl:1648 [inlined]
[9] macro expansion
@ .\lock.jl:267 [inlined]
[10] require(into::Module, mod::Symbol)
@ Base .\loading.jl:1611
during initialization of module XGBoost

This library is broken on Windows with Julia 1.9 due to a very arcane issue. It should be fixed on Julia nightly; please try that. The fix should be released in 1.9.2 (but I see no tag yet). This is the relevant PR: [CompilerSupportLibraries_jll] Upgrade to v1.0.5 by giordano · Pull Request #50135 · JuliaLang/julia · GitHub

1 Like

I don’t think the issue you’re experiencing is the one mentioned by @simsurace because the symptom would be completely different (a crash at runtime). That’d still be a problem in Julia v1.9.0-1.9.1, but that’s not what you’re reporting.

Can you please show the output of

Base.BinaryPlatforms.HostPlatform()

?

Thanks Mosè; I have two versions installed because I was trying the version I had used originally along with the latest, and neither works.
Output is:

Windows x86_64 {cxxstring_abi=cxx11, julia_version=1.8.0, libgfortran_version=5.0.0}
and
Windows x86_64 {cxxstring_abi=cxx11, julia_version=1.9.1, libgfortran_version=5.0.0}

Thanks Frank

Thanks Simone

I gave this a go and sadly it didn’t work. I wasn’t that surprised, as I had previously done a complete uninstall and removal of Julia and had reinstalled the original Julia version (1.8.0) that had worked for me before, but it now has the same problem regardless of Julia or XGBoost version. :frowning:

@giordano & @simsurace - FYI, I have also tried installing libxgboost via conda, using Julia in project environments, and coding in Visual Studio, the Julia REPL, Pluto, and JupyterLab…

Thanks Frank

Uhm, if you ]add XGBoost_jll in the same environment where you already installed XGBoost and then try

using XGBoost_jll
XGBoost_jll.host_platform

what do you get? I’m very confused, I don’t understand why you should get the error you reported at the top of the thread.

Thanks.
Here’s the output, XGBoost_jll successfully added

(@v1.8) pkg> status
Status `C:\Users\frank\.julia\environments\v1.8\Project.toml`
⌃ [add582a8] MLJ v0.19.1
⌃ [54119dfa] MLJXGBoostInterface v0.3.7
⌃ [009559a3] XGBoost v2.2.5
  [a5c6f535] XGBoost_jll v1.7.5+1

using XGBoost is okay.

However, I get the same ‘libxgboost’ error, but the stacktrace looks as if the problem has moved (same for all three Julia versions):

julia> bst = xgboost((X, y), num_round=5, max_depth=6, objective="reg:squarederror")
ERROR: UndefVarError: libxgboost not defined
Stacktrace:
  [1] XGDMatrixCreateFromMat(data::Matrix{Float32}, nrow::Int64, ncol::Int64, missing::Float32, out::Base.RefValue{Ptr{Nothing}})
    @ XGBoost.Lib C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\Lib.jl:88
  [2] xgbcall(::Function, ::Matrix{Float32}, ::Vararg{Any})
    @ XGBoost.Lib C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\Lib.jl:25
  [3] #_dmatrix#9
    @ C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\dmatrix.jl:183 [inlined]
  [4] DMatrix(x::Matrix{Float64}; kw::Base.Pairs{Symbol, Vector{Float64}, Tuple{Symbol}, NamedTuple{(:label,), Tuple{Vector{Float64}}}})
    @ XGBoost C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\dmatrix.jl:208
  [5] #DMatrix#22
    @ C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\dmatrix.jl:267 [inlined]
  [6] DMatrix
    @ C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\dmatrix.jl:267 [inlined]
  [7] #DMatrix#23
    @ C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\dmatrix.jl:269 [inlined]
  [8] DMatrix
    @ C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\dmatrix.jl:269 [inlined]
  [9] xgboost(::Tuple{Matrix{Float64}, Vector{Float64}}; kw::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol, Symbol}, NamedTuple{(:num_round, :max_depth, :objective), Tuple{Int64, Int64, String}}})
    @ XGBoost C:\Users\frank\.julia\packages\XGBoost\N1xfL\src\booster.jl:457
 [10] top-level scope
    @ REPL[12]:1
 [11] top-level scope
    @ C:\Users\frank\.julia\packages\CUDA\pCcGc\src\initialization.jl:171

You didn’t answer my question :slightly_smiling_face: I’m interested in the output of XGBoost_jll.host_platform in the Julia version/environment where you get the error.

Sorry :face_with_peeking_eye:, I understand now: if it’s in the grey box, it’s a command. I’m a bit of a noob.

Windows x86_64 {cuda=12.1, cxxstring_abi=cxx11, julia_version=1.8.0, libgfortran_version=5.0.0}

As before, it’s the same for all Julia versions.

Ok, thank you very much, that was very useful, now I understand what’s going on.

@tylerjthomas9 I think the problem is that the Windows platform is getting a cuda tag, which we probably shouldn’t be doing at all.
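For anyone following along: you can see exactly which tags your host platform advertises with only `Base.BinaryPlatforms` (ships with Julia, no packages needed). A minimal sketch; on an affected machine the printed tags would include the unexpected `cuda` entry reported above:

```julia
using Base.BinaryPlatforms

# HostPlatform() returns the Platform object Julia uses when selecting
# which JLL artifact to download/load for this machine.
host = HostPlatform()

# tags(...) returns a Dict{String,String} of every tag attached to the
# platform (os, arch, cxxstring_abi, ...). An extra "cuda" tag here can
# prevent any non-CUDA artifact from being selected.
for (k, v) in sort(collect(tags(host)); by = first)
    println(k, " = ", v)
end
```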

Thank you for helping make my first-ever use of Julia Discourse a very positive one :smiley:

  1. I am assuming a long-term fix means a package change; is there any chance you or @tylerjthomas9 could provide me with a hack, perhaps in a .jl file or something?

  2. Should I file this as an issue at Issues · dmlc/XGBoost.jl (github.com)? (another first ever for me)

In the meantime, I’ll have a go at using something like Colab :thinking:

Thx

Can you try the nightly Julia build and see if it works? Also, using XGBoost_jll@1.7.4 should give you a Windows build with no cuda tag; the tag was added as part of the refactor to support multiple CUDA versions. I can take a look at it tonight and check.
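If it helps anyone hitting this later, the downgrade can also be done from a script rather than the Pkg REPL via Pkg’s API. A minimal sketch of the workaround suggested above (the pin step is my own addition, to stop a later update from pulling the newer build back in):

```julia
using Pkg

# Equivalent to `pkg> add XGBoost_jll@1.7.4` in the Pkg REPL:
# installs the 1.7.4 binary build, which predates the CUDA-tag refactor.
Pkg.add(name = "XGBoost_jll", version = "1.7.4")

# Optionally pin it so `Pkg.update()` won't replace it with a newer,
# cuda-tagged build until the upstream fix lands.
Pkg.pin(name = "XGBoost_jll")
```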

Apologies for taking so long to reply; my environment was messy with different versions of Julia and packages, so I uninstalled and cleaned it all up first.
YES :star_struck:, your suggestion of the nightly build and XGBoost_jll@1.7.4 worked.

Here is what I am using:

(@v1.10) pkg> status
Status `C:\Users\frank\.julia\environments\v1.10\Project.toml`
   [009559a3] XGBoost v2.3.0
⌃ [a5c6f535] XGBoost_jll v1.7.4+0
Info Packages marked with ⌃ have new versions available and may be upgradable.

Tested with the example previously mentioned from Home · XGBoost.jl, and now my own code is running fine.
Huge relief.
Thanks
Frank

3 Likes

I had the same issue here. I tried the nightly build and XGBoost_jll@1.7.4 and can verify that it works! Thank you very much!!

XGBoost_jll.host_platform:

julia> versioninfo()
Julia Version 1.10.0-DEV.1510
Commit 690a5f67c1 (2023-06-19 19:23 UTC)
Platform Info:
  OS: Windows (x86_64-w64-mingw32)
  CPU: 16 × AMD Ryzen 7 3700X 8-Core Processor
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-15.0.7 (ORCJIT, znver2)
  Threads: 23 on 16 virtual cores
Environment:
  JULIA_NUM_THREADS = auto

(@v1.10) pkg> st
Status `C:\Users\Daymond\.julia\environments\v1.10\Project.toml`
  [009559a3] XGBoost v2.3.0
⌃ [a5c6f535] XGBoost_jll v1.7.4+0
Info Packages marked with ⌃ have new versions available and may be upgradable.

julia> using XGBoost_jll

julia> XGBoost_jll.host_platform
Windows x86_64 {cuda=12.1, cxxstring_abi=cxx11, julia_version=1.10.0, libgfortran_version=5.0.0}

Running XGBoost:

julia> using XGBoost

julia> (X, y) = (randn(100,4), randn(100))
([-0.2623242551305721 -0.7349016916372322 -0.12226910711070484 -1.7262374695698632; 0.8933518985340615 -1.0466514268113587 -1.2625763832961918 0.5165072356427121; … ; 1.0389513791819174 0.5070140641914969 0.250096316841046 0.8411149909580441; -0.06475505384999933 1.1945917760920695 0.683202330789277 0.6970125676128226], [0.26426235528171105, 0.8589844858085923, -0.2937243715578777, -0.45293466098853175, 0.3352454222015448, -0.044283648660174146, -0.9737646611540043, -1.472423862819522, 1.0827540869930354, -0.18105464988479517  …  -0.289517450801958, 0.030952382769496706, 0.34658208038935573, -1.0541588807051363, 0.4672706118266623, -0.2843188714799799, -1.7266824152048368, -0.5690021912345585, -0.8951092383731294, 0.42586554568021784])

julia> bst = xgboost((X, y), num_round=5, max_depth=6, objective="reg:squarederror")
[ Info: XGBoost: starting training.
[ Info: [1]     train-rmse:0.89970133211933134
[ Info: [2]     train-rmse:0.74107146090429576
[ Info: [3]     train-rmse:0.63458998856155557
[ Info: [4]     train-rmse:0.54164823654162086
[ Info: [5]     train-rmse:0.48739732068195674
[ Info: Training rounds complete.
Booster()
2 Likes