Warning: Linking two modules of different target triples: 'bcloader' ... 'start'

I started getting these warnings

warning: Linking two modules of different target triples: 'bcloader' is 'x86_64-unknown-linux-gnu' whereas 'start' is 'x86_64-linux-gnu'

warning: Linking two modules of different target triples: 'bcloader' is 'x86_64-unknown-linux-gnu' whereas 'start' is 'x86_64-linux-gnu'

┌ Warning: Using fallback BLAS replacements, performance may be degraded
└ @ Enzyme.Compiler ~/.julia/packages/GPUCompiler/qdoh1/src/utils.jl:50

when running some SciML stuff. Everything still seems to work as it should, so it is not really a problem, but I got curious and tried to find where it came from.

I managed to reduce it to the code below, which reproduces the warnings on fresh installations of both 1.8.5 and 1.9.0-beta3

using Lux, SciMLSensitivity, DifferentialEquations
using Optimization, OptimizationOptimisers

using ComponentArrays
using Random

rng = Random.default_rng()

ts = collect(0:0.01:1)
xs = [sin.(ts)'; cos.(ts)']

function df!(dx, x, p, t)
    dx .= Lux.apply(model, x, p.nnps, nnst)[1]
end

model = Lux.Chain(
    Lux.Dense(2, 32),
    Lux.Dense(32, 2),
)
nnps, nnst = Lux.setup(rng, model)

ps = ComponentVector{Float64}(; nnps)
prob_f = ODEProblem(df!, xs[:, 1], (ts[begin], ts[end]), ps)

function loss(ps, _)
    _prob = remake(prob_f, u0 = xs[:, 1], tspan = (ts[1], ts[end]), p = ps)
    xhat = Array(solve(_prob, saveat = ts))
    sum(abs2, xs .- xhat)
end

optf = Optimization.OptimizationFunction(loss, Optimization.AutoZygote())
optprob1 = Optimization.OptimizationProblem(optf, ps)
res1 = Optimization.solve(optprob1, ADAM(0.01), maxiters = 5)

I have tried to reduce it further, but several different changes make the warning go away, and I haven't managed to pin down why.

  • If I make the Lux network a single layer, no warning.
  • If I use AutoForwardDiff, no warning.
  • If I skip DifferentialEquations and replace the loss function with the code below, no warning.
function loss(ps, _)
    xhat = Lux.apply(model, xs, ps.nnps, nnst)[1]
    sum(abs2, xs .- xhat)
end

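For concreteness, the AutoForwardDiff variant from the second bullet just swaps the AD backend in the OptimizationFunction; everything else is unchanged:

optf = Optimization.OptimizationFunction(loss, Optimization.AutoForwardDiff())
optprob1 = Optimization.OptimizationProblem(optf, ps)
res1 = Optimization.solve(optprob1, ADAM(0.01), maxiters = 5)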
I was thinking it was Enzyme since that is mentioned in part of the warning, but I'm unsure exactly how Enzyme ties into this, since I don't use it directly and specifically use Zygote for AD.

Found some other mentions of similar warnings

  • Optim github: exactly the same warning, where the reporter is told it might be Enzyme.
  • Enzyme github, linked from the Optim issue. It also looks like it should have been fixed about 6 months ago.
  • discourse: exactly the same set of warnings, though the discussion is about an error that seems unrelated to it…

I have already spent more time trying to figure this out than I really have at the moment, so I thought I would post it here for now in case anyone knows anything or wants to take a look at it.

Some of the SciML stack (including ODEProblem/SciMLSensitivity) tries to use Enzyme internally by default for performance reasons.
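If you want to opt out of that default, you can pass an explicit sensealg to solve; as a sketch (assuming ZygoteVJP suits the model), this routes the adjoint's vector-Jacobian products through Zygote instead of any Enzyme-based rule:

# inside the loss, replacing the plain solve call
xhat = Array(solve(_prob, saveat = ts,
                   sensealg = InterpolatingAdjoint(autojacvec = ZygoteVJP())))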

In any case, you can ignore the warning (though the BLAS FYI may be relevant for performance – which we have on our radar for doing efficiently soon).
