FLUX.JL -- MethodError: no method matching loss()

I have trained CNNs using PyTorch, and now I am trying to run a training loop on a simple CNN in Flux for binary (cat/dog) classification, but I cannot for the life of me figure out how to get the loss function to work. I have looked around and copied what others did, but to no avail. I suspect it is something in the way I am using DataLoader, but it could also be a collection of issues, so I am going to link the GitHub repo. The images are loading in properly, so I don't think it's that, but it could be how I am manipulating them within the code.

Thank you for any help!

train!(loss, params, data, opt; cb)

For each datapoint d in data, compute the gradient of loss with respect to params through backpropagation and call the optimizer opt.

If d is a tuple of arguments to loss, call loss(d...); else call loss(d).

Docs of Flux.Optimise.train!

So I think you have to call train!(x -> loss(x...), ...) instead.
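To illustrate, here is a minimal sketch of that wrapper, assuming a two-argument loss like the one below (the model and data names are placeholders, not from the repo). Splatting works because a NamedTuple iterates its values, so d... expands to the images and labels in order:

```julia
using Flux

# Hypothetical two-argument loss, as in the original code:
loss(x, y) = Flux.Losses.logitcrossentropy(model(x), y)

# DataLoader yields NamedTuples here, which are not Tuples, so train!
# calls loss(d) with the whole batch. The wrapper splats it back out:
Flux.train!(d -> loss(d...), Flux.params(model), data, opt)
```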

MethodError: no method matching loss(::NamedTuple{(:images, :labels), 
                              Tuple{Vector{Matrix{Gray{Float64}}}, 
                              Vector{Flux.OneHotArray{UInt32, 2, 0, 1, UInt32}}}})
Closest candidates are:
  loss(::Any, ::Any) at In[19]:2

Since the error says that it is trying to call loss on a (named) tuple, I would just change the definition of the loss function from
loss(x, y) = Flux.Losses.logitcrossentropy(model(x), y)
to
loss((x, y)) = Flux.Losses.logitcrossentropy(model(x), y)
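As a self-contained sketch of why this works (the model, sizes, and batch names here are illustrative, not taken from the repo), a NamedTuple iterates its values, so it destructures just like a plain tuple:

```julia
using Flux

# Tiny stand-in model and a loss that destructures its single argument:
model = Dense(4 => 2)
loss((x, y)) = Flux.Losses.logitcrossentropy(model(x), y)

x = rand(Float32, 4, 8)                    # 8 samples, 4 features each
y = Flux.onehotbatch(rand(0:1, 8), 0:1)    # one-hot labels

batch = (images = x, labels = y)   # what a NamedTuple DataLoader yields
loss(batch)                        # destructures: x = images, y = labels
```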


The x -> loss(x...) wrapper above should do the same, but loss((x, y)) = ... looks better.


I believe I had a similar issue with Flux v0.13 trying to call

loss(m, x, y) = mean(Flux.crossentropy(m(x), y))
Flux.@epochs 10 Flux.train!(loss, m, batches, opt_state)

Changing loss(m, x, y) to loss(m, (x, y)) did the trick.
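The same destructuring idea applies under the v0.13 explicit API: train! only splats batches that are actual Tuples, so a NamedTuple batch arrives as one argument. A hedged sketch, with placeholder model and data (and assuming the newer Flux.setup/Adam optimiser API):

```julia
using Flux, Statistics

# Destructure the single NamedTuple batch inside the loss:
loss(m, (x, y)) = mean(Flux.crossentropy(m(x), y))

m = Chain(Dense(4 => 2), softmax)
opt_state = Flux.setup(Adam(), m)          # assumes Flux.setup is available
batches = [(images = rand(Float32, 4, 8),
            labels = Flux.onehotbatch(rand(0:1, 8), 0:1))]

Flux.train!(loss, m, batches, opt_state)
```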