I’m attempting something very simple for now: a nonlinear curve fit of a sine wave using Flux.jl. So far the model completely refuses to converge and gives some very strange results. I must be doing something wrong, because I’ve done this before and it worked very well.
```julia
using Flux, Plots, Statistics

# Training data: sin over two full periods
timespan = 0:0.5:4*pi
out_dat = sin.(timespan)
plot(timespan, out_dat)

hidden = 5

# One (input, target) pair per time step; inputs wrapped as 1-element vectors
dat = [([x], y) for (x, y) in zip(timespan, out_dat)]

model = Flux.Chain(
    Flux.Dense(1 => hidden, relu),
    Flux.Dense(hidden => 1))

opt_state = Flux.setup(Adam(), model)

loss(mod, x, y) = Flux.Losses.mse(mod(x), y)

# Mean loss before training
mean([loss(model, x...) for x in dat])

meanerr = 100
i = 0
while meanerr > 0.1
    i = i + 1
    Flux.train!(loss, model, dat, opt_state)
    if i % 10 == 0
        println(i)
        meanerr = mean([loss(model, x...) for x in dat])
        println(meanerr)
    end
end

# Collect the predictions and scatter-plot them
NNresult = vcat(model.([[t] for t in timespan])...)
plot(NNresult, seriestype = :scatter)
```
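One thing I wasn’t sure about is the data format: Flux’s `Dense` layers operate on feature × batch matrices, so instead of feeding one 1-element vector at a time, the whole series can go in as a single 1×N batch. Here’s a sketch of that variant (reusing the same `model`, `loss`, and `opt_state` as above; I don’t know whether the per-sample format is the actual problem):

```julia
# Same series, but as one feature × batch pair: a 1×N input matrix and a
# 1×N target matrix, so each train! call sees every sample at once.
# Float32 matches Flux's default parameter eltype.
X = reshape(collect(Float32, timespan), 1, :)
Y = reshape(Float32.(out_dat), 1, :)

for epoch in 1:1000
    Flux.train!(loss, model, [(X, Y)], opt_state)
end

# Mean squared error over the whole batch
Flux.Losses.mse(model(X), Y)
```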
As you can see, the problem is very simple. But when I run the code, the convergence is terrible even after literally thousands of epochs. For example, after 6000 epochs I get the following output:

I’ve tried it with all kinds of different settings, activation functions, and numbers of hidden neurons. It’s probably something simple I’m missing, so if anyone is able to spot what’s wrong I would be very grateful. Thanks!