ADAM optimizer error in Flux

Hi!

I am defining a neural network as:

using Flux

n_hidden = 20
model = Chain(Dense(n_inputs, n_hidden, relu),
              Dense(n_hidden, n_outputs, identity), softmax)
L(x,y) = Flux.crossentropy(model(x), y)
ps = Flux.params(model)
opt = ADAM(η = 0.001, β = (0.9, 0.999))

I am getting the following error:

MethodError: no method matching ADAM(; η=0.001, β=(0.9, 0.999))
Closest candidates are:
ADAM() at /home/subhankar/.julia/packages/Flux/8XpDt/src/optimise/optimisers.jl:99 got unsupported keyword arguments "η", "β"
ADAM(!Matched::Float64, !Matched::Tuple{Float64,Float64}, !Matched::IdDict) at /home/subhankar/.julia/packages/Flux/8XpDt/src/optimise/optimisers.jl:94 got unsupported keyword arguments "η", "β"
ADAM(!Matched::Union{Params, AbstractArray}) at /home/subhankar/.julia/packages/Flux/8XpDt/src/optimise/deprecations.jl:46 got unsupported keyword arguments "η", "β"

Stacktrace:
[1] top-level scope at In[36]:6

I am following the syntax mentioned in the docs here.

Can you tell me what I am doing wrong?

Thanks!

Those are actually positional arguments, not keyword arguments:

opt = ADAM(0.001, (0.9, 0.999))
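
In case it helps, here is a minimal end-to-end sketch using the positional constructor. The layer sizes and the dummy batch are made up just to show the optimizer in use with Flux.train!, which takes the loss, the parameters, the data, and the optimizer:

using Flux

n_inputs, n_hidden, n_outputs = 4, 20, 3   # made-up sizes, just for the example

model = Chain(Dense(n_inputs, n_hidden, relu),
              Dense(n_hidden, n_outputs, identity), softmax)
L(x, y) = Flux.crossentropy(model(x), y)
ps = Flux.params(model)

opt = ADAM(0.001, (0.9, 0.999))            # learning rate first, then the (β1, β2) tuple

x = rand(Float32, n_inputs, 10)            # dummy batch of 10 samples
y = Flux.onehotbatch(rand(1:n_outputs, 10), 1:n_outputs)
Flux.train!(L, ps, [(x, y)], opt)          # one pass over the single dummy batch

If you only want to change the learning rate, ADAM(0.001) should also work, since the β tuple defaults to (0.9, 0.999).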
