`]test Package` errors, but `include(test_script)` succeeds

When testing a package (FlashWeave.jl) via ]test FlashWeave, I'm getting errors in internal matrix computations (which rely only on SparseArrays.jl). But if I instead manually include(<test_script>) the same test script in the same environment (default v1.9 env), it runs without errors, even though I'm using the same Julia executable and package (in .julia/dev).

Curiously, if I make changes to FlashWeave and ]precompile, a subsequent include(<test_script>) runs as expected without precompilation, but ]test FlashWeave triggers a new round of precompilation (before running and then producing the error above).

Any idea what could be causing this? I have a hunch that this is related to the testing environment, even though package versions should be identical to my default v1.9 environment if I understand correctly.

Julia version:

julia> versioninfo()
Julia Version 1.9.1
Commit 147bdf428cd (2023-06-07 08:27 UTC)
Platform Info:
  OS: macOS (x86_64-apple-darwin22.4.0)
  CPU: 16 × Intel(R) Core(TM) i9-9880H CPU @ 2.30GHz
  LIBM: libopenlibm
  LLVM: libLLVM-14.0.6 (ORCJIT, skylake)
  Threads: 1 on 16 virtual cores

I think it would be useful if you included outputs and error messages in the description of the problem. Without this, it’s hard to tell what the issue is.

As far as I understand, ]test creates a new, temporary environment, possibly resolving different package versions.

If you do include("runtests.jl"), it will instead use the packages and versions of the currently activated environment.
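To see the difference for yourself, you can mimic what ]test does by activating a throwaway environment (a rough sketch, not what Pkg.test literally runs — the real test environment additionally pulls in the package's test dependencies):

```julia
using Pkg

old = Base.active_project()
Pkg.activate(temp=true)             # throwaway env, similar in spirit to what ]test builds
Pkg.status()                        # inspect what actually got resolved
@show Base.active_project() != old  # true: no longer the default v1.9 env
```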

To speed up ]test I always use


Thanks everyone for the responses, and sorry for not providing more detail. I think I figured it out in the meantime: the code in question did have a bounds error, but it was wrapped in @inbounds. The ]test workflow apparently enables bounds checks, turning the bug into a visible error, while include left the checks elided, so the out-of-bounds access was silent and (in this case) harmless. Removing @inbounds and/or fixing the bug gives consistent behaviour.
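For context: Pkg.test launches the test process with --check-bounds=yes, which makes Julia ignore @inbounds annotations, whereas a plain REPL session uses the default "automatic" mode. You can check which mode the current process is in:

```julia
# 0 = automatic (@inbounds honored), 1 = forced on (what Pkg.test passes),
# 2 = forced off. Inside ]test this prints 1; in a normal REPL it prints 0.
println(Base.JLOptions().check_bounds)
```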

For reference, this was the function in question:

function contingency_table_2d_optim!(X::Int, Y::Int, data::SparseArrays.AbstractSparseMatrixCSC{<:Integer}, test_obj::MiTest{<:Integer, Nz}) where Nz
    # Reset the contingency table
    fill!(test_obj.ctab, 0)
    rvs = rowvals(data)
    nzvs = nonzeros(data)

    # Pointers to the first and one-past-last stored entries of each column
    ptr_X, ptr_Y = data.colptr[X], data.colptr[Y]
    ptr_X_end, ptr_Y_end = data.colptr[X + 1], data.colptr[Y + 1]

    # While stored entries remain in both columns
    @inbounds while ptr_X < ptr_X_end && ptr_Y < ptr_Y_end
        # Read the current rows here, after the loop condition has been checked.
        # (The buggy version re-read rvs[ptr_X] immediately after incrementing,
        # which could run one past the column's last entry — the bounds error.)
        row_X, row_Y = rvs[ptr_X], rvs[ptr_Y]
        if row_X == row_Y
            val_X, val_Y = nzvs[ptr_X] + 1, nzvs[ptr_Y] + 1
            test_obj.ctab[val_X, val_Y] += 1
            ptr_X += 1
            ptr_Y += 1
        elseif row_X < row_Y
            ptr_X += 1   # advance only the column that is behind
        else
            ptr_Y += 1
        end
    end

    return nothing
end
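For anyone wanting to poke at the corrected two-pointer merge without the package's MiTest type, here is a self-contained sketch (count_cooccurrences is a hypothetical stand-in that only counts rows where both columns have stored entries):

```julia
using SparseArrays

function count_cooccurrences(data::SparseMatrixCSC{<:Integer}, X::Int, Y::Int)
    rvs, nzvs = rowvals(data), nonzeros(data)
    ctab = zeros(Int, 3, 3)
    ptr_X, ptr_X_end = data.colptr[X], data.colptr[X + 1]
    ptr_Y, ptr_Y_end = data.colptr[Y], data.colptr[Y + 1]
    while ptr_X < ptr_X_end && ptr_Y < ptr_Y_end
        row_X, row_Y = rvs[ptr_X], rvs[ptr_Y]
        if row_X == row_Y
            ctab[nzvs[ptr_X] + 1, nzvs[ptr_Y] + 1] += 1
            ptr_X += 1
            ptr_Y += 1
        elseif row_X < row_Y
            ptr_X += 1   # advance only the lagging pointer
        else
            ptr_Y += 1
        end
    end
    return ctab
end

data = sparse([1 1; 2 0; 0 2])   # two columns with values in {0, 1, 2}
count_cooccurrences(data, 1, 2)  # only row 1 is non-zero in both columns
```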