Build a deep neural network classifier using a long short-term memory (LSTM) architecture

Good day,

Please help me. I recently started with Julia, and I am trying to use it to build a deep neural network classifier with a long short-term memory (LSTM) architecture. I managed to load my data, do the text preprocessing and tokenization, build my vocabulary and embeddings, load the embeddings, create the embedding matrix, and convert my text to sequences of indices. I am now stuck at building the model: when I try to implement a training loop for the network using Flux.jl, I keep getting revolving errors, and when I fix one, another appears. Kindly help me. Thank you so much for your help.
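For context, my preprocessing stage looks roughly like this (a condensed sketch with illustrative names, not my exact code):

```julia
# Raw documents, simple whitespace tokenization
docs   = ["some example text", "another example document"]
tokens = [split(lowercase(d)) for d in docs]

# Vocabulary: token => integer index
vocab = Dict(tok => i for (i, tok) in enumerate(unique(reduce(vcat, tokens))))

# Each document becomes a sequence of vocabulary indices
sequences = [[vocab[t] for t in toks] for toks in tokens]

# Embedding matrix: embedding_dim × vocabulary_size (mine is loaded from a file)
embedding_dim = 50
embeddings    = randn(Float32, embedding_dim, length(vocab))
```

And here is the training loop where I am stuck: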

```julia
for epoch in 1:epochs
    Flux.reset!(model)

    train_correct = 0
    train_total = 0

    # Create batches iterator
    batches = eachbatch((train_X, train_y_onehot), size=batch_size)

    for (x, y) in batches
        # 1. Forward pass and accuracy calculation
        ŷ = model(x)
        train_correct += sum(onecold(ŷ) .== onecold(y))
        train_total += size(y, 2)

        # 2. Compute gradient
        grads = gradient(model) do m
            ŷ = m(x)
            return Flux.crossentropy(ŷ, y)
        end

        # 3. Optimizer update
        opt_state, ps = Optimisers.update(opt, ps, grads[1], opt_state)
    end

    # Calculate epoch statistics
    train_acc = train_correct / train_total
    println("Epoch $epoch: Train Accuracy = $train_acc")
end
```

And below is the error that I am getting:

```
MethodError: no method matching iterate(::Optimisers.Adam)

The function `iterate` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  iterate(::Base.MethodSpecializations)
   @ Base reflection.jl:1299
  iterate(::Base.MethodSpecializations, ::Nothing)
   @ Base reflection.jl:1305
  iterate(::Base.MethodSpecializations, ::Int64)
   @ Base reflection.jl:1306
  ...

Stacktrace (most recent calls first):
 [1] _zip_iterate_some
     @ Base iterators.jl:444
 [2] _zip_iterate_all
     @ Base iterators.jl:436
 [3] iterate(z::Base.Iterators.Zip{…})
     @ Base iterators.jl:426
 [4] foreach(::Function, ::Optimisers.Adam, ::@NamedTuple{…}, ::@NamedTuple{…}, ::Vararg{…})
     @ Base abstractarray.jl:3188
 [5] foreachvalue(::Function, ::Optimisers.Adam, ::@NamedTuple{…}, ::Vararg{…})
     @ Optimisers utils.jl:10
 [6] _grads!(::IdDict{…}, ::Optimisers.Adam, ::@NamedTuple{…}, ::@NamedTuple{…}, ::Vararg{…})
     @ Optimisers interface.jl:118
 [7] update!(tree::Optimisers.Adam, model::@NamedTuple{…}, grad::@NamedTuple{…}, higher::@NamedTuple{…})
     @ Optimisers interface.jl:74
 [8] update(tree::Optimisers.Adam, model::@NamedTuple{…}, grad::@NamedTuple{…}, higher::@NamedTuple{…})
     @ Optimisers interface.jl:67
 [9] This cell: line 23
        # 3. Optimizer update
        opt_state, ps = Optimisers.update(opt, ps, grads[1], opt_state)
    end
```

The Julia version I am using is 1.11.4.

Thank you for your help.

Hard to say what is wrong without seeing more code, but that call to `update` on line 23, as mentioned in the error message, has an extra argument compared to the documentation. Did you try

```julia
opt_state, ps = Optimisers.update(opt_state, ps, grads[1])
```

For that to work, you probably need to change whatever code created `opt` to create `opt_state`.
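That would be something along these lines (an untested sketch, assuming the explicit-style API of a recent Flux/Optimisers.jl; `model` and `grads` are from your code):

```julia
using Optimisers

# Before the training loop: wrap the Adam rule in per-parameter state
opt_state = Optimisers.setup(Optimisers.Adam(1f-3), model)

# Inside the loop: update takes (state, model, gradient) and returns both updated
opt_state, model = Optimisers.update(opt_state, model, grads[1])
```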

I just did that but I am getting the same error.

```julia
opt_state, ps = Optimisers.update(opt, ps, grads[1])
```

Okay, thanks. I will try again.

`ps` should be replaced by `model` in the `update` call. Make sure you have an up-to-date Flux version and read the corresponding documentation.
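Putting it together, the loop would look roughly like this (an untested sketch against a recent Flux; `epochs`, `batches`, and `model` as in your code):

```julia
using Flux, Optimisers

# Set up the optimiser state once, before the epoch loop
opt_state = Optimisers.setup(Optimisers.Adam(1f-3), model)

for epoch in 1:epochs
    # (on older Flux versions with stateful recurrent layers you may
    #  also need Flux.reset!(model) here)
    train_correct = 0
    train_total = 0

    for (x, y) in batches
        # Forward pass and accuracy bookkeeping
        ŷ = model(x)
        train_correct += sum(Flux.onecold(ŷ) .== Flux.onecold(y))
        train_total += size(y, 2)

        # Differentiate with respect to the model itself
        grads = Flux.gradient(m -> Flux.crossentropy(m(x), y), model)

        # update takes (state, model, gradient) and returns both updated
        opt_state, model = Optimisers.update(opt_state, model, grads[1])
    end

    println("Epoch $epoch: Train Accuracy = $(train_correct / train_total)")
end
```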