Method not found error

Hi, I’m new to Julia and can’t figure out what’s going wrong here.
When I run the following code in the REPL, it runs and creates a chain of convolution layers:

A = Chain([Flux.Conv((3,3), 1=>3, stride=1, pad=2, bias=false) for _ in 1:3])

However, when I include it in a function in a file called net.jl, in the following way:

using LinearAlgebra, Flux 

function init_filters(K = 3, #num unrollings
                    M = 64, #num filters in each filter bank 
                    P = 7, #square filter size, 
                    s = 1, #stride of convolutions
                    C = 1,  #num input channels
                    t₀ = 0 #threshold
                    )

        A = Chain([Flux.Conv((P,P), C=>M, stride=s, pad=(P-1)//2, bias=false) for _ in 1:K])
        B = Chain([Flux.ConvTranspose((P,P), M=>C, stride=s, pad=(P-1)//2, bias=false) for _ in 1:K])

        W = randn(M,C, P, P) 
        τ = t₀ * ones(K, 2, M, 1, 1)

        for k in 1:K
            A[k].weight = W
            B[k].weight = W
        end

        return A, B, τ
end
init_filters()

it gives the following error:

ERROR: MethodError: no method matching expand(::Val{4}, ::Rational{Int64})

Closest candidates are:
  expand(::Any, ::Tuple)
   @ Flux ~/.julia/packages/Flux/UsEXa/src/layers/conv.jl:6
  expand(::Any, ::Integer)
   @ Flux ~/.julia/packages/Flux/UsEXa/src/layers/conv.jl:7

Stacktrace:
 [1] calc_padding(lt::Type, pad::Rational{Int64}, k::Tuple{Int64, Int64}, dilation::Tuple{Int64, Int64}, stride::Tuple{Int64, Int64})
   @ Flux ~/.julia/packages/Flux/UsEXa/src/layers/conv.jl:48
 [2] Conv(w::Array{Float32, 4}, b::Bool, σ::Function; stride::Int64, pad::Rational{Int64}, dilation::Int64, groups::Int64)
   @ Flux ~/.julia/packages/Flux/UsEXa/src/layers/conv.jl:158
 [3] Conv(k::Tuple{Int64, Int64}, ch::Pair{Int64, Int64}, σ::typeof(identity); init::Function, stride::Int64, pad::Rational{Int64}, dilation::Int64, groups::Int64, bias::Bool)
   @ Flux ~/.julia/packages/Flux/UsEXa/src/layers/conv.jl:168
 [4] (::var"#22#24"{Int64, Int64, Int64, Int64})(#unused#::Int64)
   @ Main ./none:0
 [5] iterate
   @ ./generator.jl:47 [inlined]
 [6] collect
   @ ./array.jl:782 [inlined]
 [7] init_filters(K::Int64, M::Int64, P::Int64, s::Int64, C::Int64, t₀::Int64)
   @ Main ~/Documents/Untitled Folder/CDLNet/net.jl:24
 [8] init_filters()
   @ Main ~/Documents/Untitled Folder/CDLNet/net.jl:23
 [9] top-level scope
   @ ~/Documents/Untitled Folder/CDLNet/net.jl:68

PS: Updating Flux did not help.

Hi!
Could you provide a complete MWE (minimal working example) with all the variables and package imports necessary? That would be helpful for debugging.

The problem is that your padding is not an Integer:

A = Chain([Flux.Conv((3,3), 1=>3, stride=1, pad=(7-1)//2, bias=false) for _ in 1:3])

fails with the same error. Instead of constructing a Rational, which is what // does (check the docs via ?// in the REPL), you probably want integer division, i.e. div(7 - 1, 2) or (7 - 1) ÷ 2 (÷ can be entered as \div<tab>).
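For instance, with integer division the padding stays an Int and the construction from the question goes through (a minimal sketch, assuming Flux is installed):

```julia
using Flux

P = 7                # square filter size
pad = (P - 1) ÷ 2    # integer division: pad == 3, an Int, not a Rational

# Same Chain as in the question, now with an Integer pad
A = Chain([Conv((P, P), 1 => 3, stride = 1, pad = pad, bias = false) for _ in 1:3])
```

The `//` operator always produces a `Rational{Int64}`, even when the division is exact, which is why Flux’s padding logic rejects it.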


Thanks a lot, this solved the problem!