I am trying to use `FluxOptTools` (GitHub - baggepinnen/FluxOptTools.jl: Use Optim to train Flux models and visualize loss landscapes) to train a Flux model with `Optim`. I followed the example provided in the README and it works as intended, but when I slightly adapted it to my own problem it stopped working properly: the code runs without errors, yet the model's parameters are not being updated. Can anyone help me figure out what I am doing wrong?

Here is my code:

```julia
using Pkg
Pkg.activate(".")
Pkg.add(["DataFrames", "RDatasets", "Flux", "FluxOptTools", "Zygote", "Optim", "LossFunctions"])
using DataFrames
using RDatasets
using Flux, Zygote, Optim, FluxOptTools, Statistics
using LossFunctions
diabetes = dataset("MASS", "Pima.te")
y_df = diabetes[!,:Type] .== "Yes"
X_df = diabetes[!, Not(:Type)]
# Convert X and y into a feature matrix and a target row vector
y = vec(y_df)'
X = Matrix(Matrix(X_df)')
m = Chain(Dense(7, 20),
          Dense(20, 50),
          Dense(50, 10),
          Dense(10, 1, sigmoid))
loss() = mean(value(PerceptronLoss(), m(X), y))  # mean element-wise loss
Zygote.refresh()
pars = Flux.params(m)          # collect the trainable parameters
initial_par = deepcopy(pars)   # deepcopy: a plain assignment would only alias pars
lossfun, gradfun, fg!, p0 = optfuns(loss, pars)
res = Optim.optimize(Optim.only_fg!(fg!), p0, LBFGS() ,Optim.Options(show_trace=true))
```
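For reference, this is roughly how I check whether the parameters actually moved after optimization (a minimal sketch; `before` is a deep copy, since a plain assignment would only alias the live parameters and always compare equal):

```julia
# Snapshot the parameters before training. deepcopy is essential here:
# `before = Flux.params(m)` would alias the same arrays that training mutates.
before = deepcopy(Flux.params(m))

res = Optim.optimize(Optim.only_fg!(fg!), p0, LBFGS(), Optim.Options(show_trace=true))

# Compare each parameter array before/after; `changed` should be true
# if the optimizer updated the model.
changed = any(a != b for (a, b) in zip(before, Flux.params(m)))
@show changed
```

In my case `changed` comes out `false`, which is why I believe the parameters are not being updated.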