# Question about how to train a neural network using Flux.jl

I am trying to build a neural network with about 200 Boolean features as inputs, and I would like to train it against a regression output (so `y` is continuous). I tried to set up the network like so:

```julia
using Flux

model = Flux.Chain(
    Dense(14*16 + 4, 64, relu),
    Dense(64, 16, relu),
    Dense(16, 1, relu));
```

So I have three dense layers, all activated by `relu`. To test my understanding I just ran an `x` through the model:

```julia
x = rand(Bool, 14*16 + 4)
model(x)
```

So far so good, but the code below gives an error:

```julia
y = 100
loss(x, y) = (model(x) - y)^2

Flux.train!(loss, [(x, y)])
# Flux.train!(loss, [(x, y)], ADAM(...)) # this didn't work either
```

and the error is:

```
MethodError: no method matching train!(::typeof(loss), ::Array{Tuple{Array{Bool,2},Array{Float64,1}},1})
Closest candidates are:
  train!(::Any, ::Any, !Matched::Any; cb) at /home/jrun/.julia/packages/Flux/UHjNa/src/optimise/train.jl:56

Stacktrace:
 top-level scope at In:1
```

This is after browsing through the model zoo, and I can't seem to figure out how to train this simple neural network. I am doing a simple prototype of reinforcement learning, so being able to update the model incrementally is also important. I would appreciate pointers to `Flux.train!` and `Flux.update!` and how to make them work.

`loss(x, y)` needs to return a scalar:

```julia
using Flux

model = Flux.Chain(
    Dense(14*16 + 4, 64, relu),
    Dense(64, 16, relu),
    Dense(16, 1, relu));

x = rand(Bool, 14*16 + 4)
y = 100
loss(x, y) = sum((model(x) .- y).^2)
```
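With a scalar loss in hand, `Flux.train!` also needs the model's parameters and an optimiser, which is what the `MethodError` is complaining about. The sketch below assumes the Zygote-based Flux API of that era (the error path points at an older Flux release, roughly 0.10); exact names may differ slightly in other versions. It also shows the manual `gradient`/`update!` step, which gives you the per-step control that is often useful in reinforcement learning:

```julia
using Flux

# Same model, data, and loss as above (names taken from the question).
model = Flux.Chain(
    Dense(14*16 + 4, 64, relu),
    Dense(64, 16, relu),
    Dense(16, 1, relu));

x = rand(Bool, 14*16 + 4)
y = 100
loss(x, y) = sum((model(x) .- y).^2)

ps  = Flux.params(model)   # collect the trainable parameters
opt = ADAM()               # optimiser with default hyperparameters

# train! makes one pass over the data, updating ps after each sample:
Flux.train!(loss, ps, [(x, y)], opt)

# Equivalent manual step: compute gradients, then apply one update.
gs = gradient(() -> loss(x, y), ps)
Flux.Optimise.update!(opt, ps, gs)
```

Either form should reduce `loss(x, y)` over repeated calls; for an RL prototype the manual form is usually preferable, since you control exactly when and with what data each update happens.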