Confusing dispatch problem

I get an unexpected MethodError when running a piece of code. The relevant bit of the error output is:

ERROR: LoadError: MethodError: no method matching make_my_model(::Int64, ::Int64, ::Int64; no_inchannels=3)
Closest candidates are:
  make_my_model(::Any...; no_inchannels, kwargs...) at ~/3Dto2D/v2/models/model_channel_toggle.jl:40

What I find strange about this is that the method identified as the “closest candidate” has a signature that should be able to accommodate any invocation I can think of. This is the line in question:

function make_my_model(args...; no_inchannels = 2, kwargs...)

I had imagined that any positional arguments would end up in args, that the keyword argument no_inchannels would either be set by the caller (as it was in this case) or default to 2, and that any other keyword arguments would end up in kwargs.

Why wouldn’t it be compatible with the call make_my_model(1, 4, 7; no_inchannels = 3)?

No world-age issues are mentioned in the error messages either. I am running Julia 1.8.5 on Linux.

It should work, and I cannot reproduce on 1.8.5:

julia> function make_my_model(args...; no_inchannels = 2, kwargs...)
         @show args, kwargs
       end
make_my_model (generic function with 1 method)

julia> make_my_model(1,2,3; no_inchannels=3)
(args, kwargs) = ((1, 2, 3), Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}())
((1, 2, 3), Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}())

What does methods(make_my_model) say?

# 1 method for generic function "make_my_model":
[1] make_my_model(args...; no_inchannels, kwargs...) in Main at /home/johjo50/3Dto2D/v2/models/model_channel_toggle.jl:40

Hmm, no idea honestly. Does the error persist in a fresh session?

Could you reproduce this for us with a fully self-contained minimum working example? Here is one way to produce the cryptic MethodError:

julia> function make_my_model(args...; no_inchannels = 2, kwargs...)
         @show args kwargs
         throw(MethodError(make_my_model, args))
       end
make_my_model (generic function with 1 method)

julia> make_my_model(1,2,3; no_inchannels=3)
args = (1, 2, 3)
kwargs = Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}()
ERROR: MethodError: no method matching make_my_model(::Int64, ::Int64, ::Int64)
Closest candidates are:
  make_my_model(::Any...; no_inchannels, kwargs...) at REPL[28]:1
Stacktrace:
 [1] make_my_model(::Int64, ::Vararg{Int64}; no_inchannels::Int64, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
   @ Main ./REPL[28]:3
 [2] top-level scope
   @ REPL[29]:1

I only run it from the command line as <SOME ENVIRONMENT VARIABLE DEFINITIONS> julia <some options> code.jl <lots of arguments> > save.the.stdout 2> save.the.stderr. I’m checking that it still behaves the same.

I reran it twice with some variations and it looks exactly the same. I still haven’t found any good way of running a debugger* on the code, so I progress slowly. I’ll start by putting in some more diagnostic output, I guess.

* You can load Debugger.jl in the REPL, set ARGS, set one or more breakpoints in files that will be running and finally do @enter include("code.jl"). You can do that, but it doesn’t work because all of your code will run in some “inner evaluation”, not subject to breakpoints and stepping. What you could debug this way, however, is the REPL itself. Unfortunately, I have nothing that I know needs fixing there, and if I did I wouldn’t be qualified to do it.

Update (but no epiphanies, so don’t get your hopes up):

I added some diagnostic output immediately before the call to make_my_model and some at the top of the function body. When I run the code again, this snippet (also including the output from methods that @skleinbo asked for) appears in the output:

# 1 method for generic function "make_my_model":
[1] make_my_model(args...; no_inchannels, kwargs...) in Main at /home/johjo50/3Dto2D/v2/models/model_channel_toggle.jl:40
Before the call make_my_model(arguments...; no_inchannels = channels)
arguments = [5, 7, 8, 9]
channels = 3
Inside make_my_model(args...; no_inchannels = 2, kwargs...)
args = (5, 7, 8, 9)
no_inchannels = 3
kwargs = Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}()

As far as I can see every important aspect is the same (I note that I use some different numbers, but I can’t see why that should matter). All the diagnostic messages look as I expected, more or less. No other changes were made to the code.

Yet, now it manages to find a method: the method that has been there all the time! Is there some explanation to what is happening? Is there a lesson to be learned?

Now that I can progress beyond this, I get a load of output (see below) on stderr from what I assume is a totally unrelated problem. Is that a fair assumption?

┌ Error: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 85 message(s)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: ptr.isSupported()
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: engine_post_checks(handle, *ebuf.get(), engine.getPerfKnobs(), req_size)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: finalize_internal()
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:35.819594 (0d+0h+2m+0s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Warning: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 29 message(s)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│ Time: 2023-02-01T11:36:39.049080 (0d+0h+2m+4s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:138
┌ Warning: CuDNN (v8302) function cudnnBatchNormalizationForwardTraining() called:
│     Info: Traceback contains 1 message(s)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: !canRunSemiPersist
│ Time: 2023-02-01T11:36:45.400175 (0d+0h+2m+10s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:138
┌ Error: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 85 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.612761 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Error: CuDNN (v8302) function cudnnConvolutionForward() called:
│     Info: Traceback contains 3 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.613024 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Warning: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 65 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│ Time: 2023-02-01T11:36:52.643631 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:138
┌ Error: CuDNN (v8302) function cudnnConvolutionForward() called:
│     Info: Traceback contains 3 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.643796 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Warning: CuDNN (v8302) function cudnnBatchNormalizationForwardTraining() called:
│     Info: Traceback contains 1 message(s)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: !canRunSemiPersist
│ Time: 2023-02-01T11:36:52.643947 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:138
┌ Error: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 94 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.689861 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Error: CuDNN (v8302) function cudnnConvolutionForward() called:
│     Info: Traceback contains 3 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.690034 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Warning: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 74 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│ Time: 2023-02-01T11:36:52.703894 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:138
┌ Error: CuDNN (v8302) function cudnnConvolutionForward() called:
│     Info: Traceback contains 3 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.704060 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Warning: CuDNN (v8302) function cudnnBatchNormalizationForwardTraining() called:
│     Info: Traceback contains 1 message(s)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: !canRunSemiPersist
│ Time: 2023-02-01T11:36:52.704167 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:138
┌ Error: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 94 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.731834 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Error: CuDNN (v8302) function cudnnConvolutionForward() called:
│     Info: Traceback contains 3 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.732003 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Warning: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 74 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│ Time: 2023-02-01T11:36:52.748695 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:138
┌ Error: CuDNN (v8302) function cudnnConvolutionForward() called:
│     Info: Traceback contains 3 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.748856 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Warning: CuDNN (v8302) function cudnnBatchNormalizationForwardTraining() called:
│     Info: Traceback contains 1 message(s)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: !canRunSemiPersist
│ Time: 2023-02-01T11:36:52.748991 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:138
┌ Error: CuDNN (v8302) function cudnnGetConvolutionForwardAlgorithmMaxCount() called:
│     Info: Traceback contains 94 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: false == cudnn::cnn::isForwardSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Warning: CUDNN_STATUS_NOT_SUPPORTED; Reason: T_ENGINEMAP::isLegacyAlgoSupported(handle, xDesc, wDesc, cDesc, yDesc, algo)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.758726 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
┌ Error: CuDNN (v8302) function cudnnConvolutionForward() called:
│     Info: Traceback contains 3 message(s)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: dimA[i] <= 0
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: cudnn::ops::setTensorNdDescriptor(desc, dtype, nbDims, dimA, strideA, true)
│         Error: CUDNN_STATUS_BAD_PARAM; Reason: initStatus = getXDescriptor(conv, &xDescCompat)
│ Time: 2023-02-01T11:36:52.758896 (0d+0h+2m+17s since start)
│ Process=50017; Thread=50017; GPU=NULL; Handle=NULL; StreamId=NULL.
└ @ CUDA.CUDNN ~/.julia/packages/CUDA/Ey3w2/lib/cudnn/CUDNN.jl:140
⋮

It goes on for another 30 000 lines, but even the output above has been heavily edited to keep up the dramatic tension of the narrative.

I suppose I should enquire about this in some GPU forum, but if anyone knows what this is about, please tell me.

The error messages in my last post seem to be connected to running julia with -g 2.

I am now back to the error cited before. I have also tried to remove the keyword argument from the argument list. The results are very similar. See below:

# 1 method for generic function "make_my_model":
[1] make_my_model(args...; kwargs...) in Main at /home/johjo50/3Dto2D/v2/models/model_channel_toggle.jl:41
Before the call make_my_model(arguments...; no_inchannels = channels)
arguments = [1, 4, 7]
channels = 3
ERROR: LoadError: MethodError: no method matching make_my_model(::Int64, ::Int64, ::Int64; no_inchannels=3)
Closest candidates are:
  make_my_model(::Any...; kwargs...) at ~/3Dto2D/v2/models/model_channel_toggle.jl:41

Had this worked, I would just have extracted no_inchannels from kwargs. As I understand it, the two solutions would be equivalent.
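For reference, extracting the keyword from the catch-all could look like the following sketch (plain Base Julia; the function body is a stand-in for illustration, not the real model constructor):

```julia
function make_my_model(args...; kwargs...)
    # kwargs behaves like a dictionary keyed by Symbol, so get() works:
    # use the passed value of no_inchannels, or default to 2.
    no_inchannels = get(kwargs, :no_inchannels, 2)
    return (args, no_inchannels)
end

make_my_model(1, 4, 7; no_inchannels = 3)  # ((1, 4, 7), 3)
make_my_model(1, 4, 7)                     # ((1, 4, 7), 2)
```

This reproduces the behaviour of an explicit no_inchannels = 2 default, so the two signatures should indeed be interchangeable here.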

I am perplexed by the continuing lack of a minimum working example here.

What is the shortest code.jl and command line invocation that produces this error?

Just make a copy of your code.jl and start deleting lines. As you delete lines, check to see if the problem still exists. Once you have achieved the bare minimum number of lines that produce the problem, give us the contents of code.jl and how to invoke the script.


Don’t be offended. I am incredibly stressed at work at the moment and can’t even put together the minimal example. I hope I am permitted to ask questions anyway. When I have the time I don’t mind spending it on this sort of thing; I even answer other people’s questions without any thought of “what’s in it for me”. Now, however, is the time to finish some articles in a hurry, or the rest won’t matter much.