I’m starting to use Flux for a neural-network task, but I’m having trouble understanding an error message. The following is not my actual use case, but if I can understand what’s going wrong here, maybe I can fix my real problem…
```julia
using Flux

model = Dense(21, 10, sigmoid)
loss_f(x, y) = Flux.mse(model(x), y)
opt = ADAM(0.1)

inp_data = rand(21)
out_labs = rand(1:4, 1)

loss_f(inp_data, out_labs)  # this works, i.e., I get a number
Flux.train!(loss_f, Flux.params(model), zip(inp_data, out_labs), opt)
```
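While putting this example together I noticed something that may be relevant (just my guess, not confirmed): `zip` over two vectors iterates element by element, so it would hand `train!` pairs of individual numbers rather than the whole arrays:

```julia
# zip pairs up *elements* of the two vectors, not the vectors themselves
x = rand(21)
y = rand(1:4, 1)
first(zip(x, y))  # a (Float64, Int64) tuple, matching the types in the MethodError below
```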
At the last line I get:

```
ERROR: MethodError: no method matching (::Dense{typeof(σ),TrackedArray{…,Array{Float32,2}},TrackedArray{…,Array{Float32,1}}})(::Float64)
Closest candidates are:
  Dense(::AbstractArray{T<:Union{Float32, Float64},N} where N) where {T<:Union{Float32, Float64}, W<:(AbstractArray{T<:Union{Float32, Float64},N} where N)} at /Users/austinbean/.julia/packages/Flux/qXNjB/src/layers/basic.jl:110
  Dense(::AbstractArray{#s104,N} where N where #s104<:Real) where {T<:Union{Float32, Float64}, W<:(AbstractArray{T<:Union{Float32, Float64},N} where N)} at /Users/austinbean/.julia/packages/Flux/qXNjB/src/layers/basic.jl:113
  Dense(::AbstractArray) at /Users/austinbean/.julia/packages/Flux/qXNjB/src/layers/basic.jl:98
Stacktrace:
 [1] loss_f(::Float64, ::Int64) at ./none:1
 [2] #14 at /Users/austinbean/.julia/packages/Flux/qXNjB/src/optimise/train.jl:72 [inlined]
 [3] gradient_(::getfield(Flux.Optimise, Symbol("##14#20")){typeof(loss_f)}, ::Tracker.Params) at /Users/austinbean/.julia/packages/Tracker/RRYy6/src/back.jl:97
 [4] #gradient#24(::Bool, ::Function, ::Function, ::Tracker.Params) at /Users/austinbean/.julia/packages/Tracker/RRYy6/src/back.jl:164
 [5] gradient at /Users/austinbean/.julia/packages/Tracker/RRYy6/src/back.jl:164 [inlined]
 [6] macro expansion at /Users/austinbean/.julia/packages/Flux/qXNjB/src/optimise/train.jl:71 [inlined]
 [7] macro expansion at /Users/austinbean/.julia/packages/Juno/TfNYn/src/progress.jl:124 [inlined]
 [8] #train!#12(::getfield(Flux.Optimise, Symbol("##16#22")), ::Function, ::Function, ::Tracker.Params, ::Base.Iterators.Zip{Tuple{Array{Float64,1},Array{Int64,1}}}, ::ADAM) at /Users/austinbean/.julia/packages/Flux/qXNjB/src/optimise/train.jl:69
 [9] train!(::Function, ::Tracker.Params, ::Base.Iterators.Zip{Tuple{Array{Float64,1},Array{Int64,1}}}, ::ADAM) at /Users/austinbean/.julia/packages/Flux/qXNjB/src/optimise/train.jl:67
 [10] top-level scope at none:0
```
I confess that I don’t really know what Flux expects in the `data` argument of `train!` (the docs are a little terse there), but this may have absolutely nothing to do with the problem. Thanks for your help with such a basic question.
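For what it’s worth, here is the variant I would try next. This is purely my guess, assuming `train!` wants an iterable of `(input, label)` tuples (one tuple per training example) rather than a `zip` of the raw vectors; I haven’t confirmed this is the intended usage:

```julia
using Flux

model = Dense(21, 10, sigmoid)
loss_f(x, y) = Flux.mse(model(x), y)
opt = ADAM(0.1)

inp_data = rand(21)
out_labs = rand(1:4, 1)

# Wrap the single example in a vector, so train! iterates over one
# (x, y) tuple instead of over individual numbers.
data = [(inp_data, out_labs)]
Flux.train!(loss_f, Flux.params(model), data, opt)
```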