# Very basic Flux problem

I’m starting to use Flux for a neural-network task, but I’m having trouble understanding an error message. The following is not my actual use case, but maybe if I understand what’s going wrong here I can fix my real problem…

``````
using Flux

model = Dense(21, 10, sigmoid)
loss_f(x, y) = Flux.mse(model(x), y)

inp_data = rand(21)
out_labs = rand(1:4, 1)

opt = ADAM()

loss_f(inp_data, out_labs) # this works, i.e., I get a number

Flux.train!(loss_f, Flux.params(model), zip(inp_data, out_labs), opt)
``````

At the last line I get:

``````
ERROR: MethodError: no method matching (::Dense{typeof(σ),TrackedArray{…,Array{Float32,2}},TrackedArray{…,Array{Float32,1}}})(::Float64)
Closest candidates are:
Dense(::AbstractArray{T<:Union{Float32, Float64},N} where N) where {T<:Union{Float32, Float64}, W<:(AbstractArray{T<:Union{Float32, Float64},N} where N)} at /Users/austinbean/.julia/packages/Flux/qXNjB/src/layers/basic.jl:110
Dense(::AbstractArray{#s104,N} where N where #s104<:Real) where {T<:Union{Float32, Float64}, W<:(AbstractArray{T<:Union{Float32, Float64},N} where N)} at /Users/austinbean/.julia/packages/Flux/qXNjB/src/layers/basic.jl:113
Dense(::AbstractArray) at /Users/austinbean/.julia/packages/Flux/qXNjB/src/layers/basic.jl:98
Stacktrace:
[1] loss_f(::Float64, ::Int64) at ./none:1
[2] #14 at /Users/austinbean/.julia/packages/Flux/qXNjB/src/optimise/train.jl:72 [inlined]
[3] gradient_(::getfield(Flux.Optimise, Symbol("##14#20")){typeof(loss_f)}, ::Tracker.Params) at /Users/austinbean/.julia/packages/Tracker/RRYy6/src/back.jl:97
[4] #gradient#24(::Bool, ::Function, ::Function, ::Tracker.Params) at /Users/austinbean/.julia/packages/Tracker/RRYy6/src/back.jl:164
[5] gradient at /Users/austinbean/.julia/packages/Tracker/RRYy6/src/back.jl:164 [inlined]
[6] macro expansion at /Users/austinbean/.julia/packages/Flux/qXNjB/src/optimise/train.jl:71 [inlined]
[7] macro expansion at /Users/austinbean/.julia/packages/Juno/TfNYn/src/progress.jl:124 [inlined]
[8] #train!#12(::getfield(Flux.Optimise, Symbol("##16#22")), ::Function, ::Function, ::Tracker.Params, ::Base.Iterators.Zip{Tuple{Array{Float64,1},Array{Int64,1}}}, ::ADAM) at /Users/austinbean/.julia/packages/Flux/qXNjB/src/optimise/train.jl:69
[9] train!(::Function, ::Tracker.Params, ::Base.Iterators.Zip{Tuple{Array{Float64,1},Array{Int64,1}}}, ::ADAM) at /Users/austinbean/.julia/packages/Flux/qXNjB/src/optimise/train.jl:67
[10] top-level scope at none:0
``````

I confess I don’t really know what Flux expects in the `data` argument to `train!()` - the docs are a little terse there. But this may have absolutely nothing to do with the problem. Thanks for your help with such a basic question.

I don’t have a Julia installation on my work computer, unfortunately, so I can’t fully verify this, but I think I see what’s going on. Try changing the `train!` call to this and see if it works:

``````
Flux.train!(loss_f, Flux.params(model), [(inp_data, out_labs)], opt)
``````
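If you want to sanity-check this without Flux installed, here is a pure-Julia sketch (using the same `inp_data` and `out_labs` as in the question) of what `train!` would iterate over with the wrapped form:

``````
# Pure-Julia check of what train! iterates over when the dataset is
# a one-element vector containing the (input, label) tuple.
inp_data = rand(21)
out_labs = rand(1:4, 1)

data = [(inp_data, out_labs)]

d = first(data)            # the single training example
@assert d[1] === inp_data  # the whole 21-element input vector...
@assert length(d[1]) == 21
@assert d[2] === out_labs  # ...paired with the whole label vector
``````

So each iteration hands the loss function a complete `(input, label)` pair, not loose scalars.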

When you `zip(inp_data, out_labs)`, `zip` pairs up the individual elements of the two vectors, so `train!` ends up calling the model on single floats pulled out of your input vector (notice that the error message complains there is no method of `Dense` that accepts a single `Float64` as input). You can see in the definition of `train!` that it just iterates over `data` and splats each example into the loss function, pretty much like this:

``````
for d in data
    loss(d...)
end
``````
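You can verify this element-wise pairing with plain Julia (no Flux needed), using the same `inp_data` and `out_labs` as in the question:

``````
inp_data = rand(21)
out_labs = rand(1:4, 1)

# zip pairs up individual elements and stops at the shorter collection,
# so here it yields exactly one (Float64, Int64) pair of scalars.
examples = collect(zip(inp_data, out_labs))

@assert length(examples) == 1
@assert examples[1][1] isa Float64   # a lone scalar, not the 21-vector
@assert examples[1][2] isa Int64
``````

That lone `Float64` is exactly what the `MethodError` says `Dense` received.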

In other words, if your data looks like this (and your loss function is of the form `loss(x, y)`):

`data = [(x1, y1), (x2, y2), ...]`

then you can pass it in as-is:

`Flux.train!(loss, params(model), data, opt)`

I’m guessing that you picked up the `zip` from an example where you had something like:

``````
xs = [x1, x2, x3, ...]
ys = [y1, y2, y3, ...]
``````

In that case, you can zip them up like this:

`Flux.train!(loss, params(model), zip(xs, ys), opt)`
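As a concrete illustration (with made-up shapes matching the `Dense(21, 10, sigmoid)` model above): when each element of `xs` and `ys` is itself a complete example, `zip` hands the loss whole vectors rather than scalars:

``````
# Each element of xs/ys is one complete training example,
# so zip yields (input_vector, target_vector) tuples.
xs = [rand(21) for _ in 1:3]
ys = [rand(10) for _ in 1:3]

for (x, y) in zip(xs, ys)
    @assert length(x) == 21   # full input vector, as Dense(21, 10, ...) expects
    @assert length(y) == 10   # full target vector, matching the model output
end
``````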

Hope that helps!