Simple custom recurrent layer in Flux


I am looking into Flux and trying to understand the recurrent functionality. The following code defines a first-order low-pass filter with Flux.Recur.

using Flux
using Flux.Tracker

c = 0.5

function LP(h, x)
    h = c * x + (1.0 - c) * h    # first-order low pass: mix input with previous state
    return h, h                  # Recur expects (new_state, output)
end

h = 0.
recurLP = Flux.Recur(LP, h)

function model(x)
    out = zeros(1000, 1)
    for ii = 1:1000
        out[ii] = recurLP(x[ii])
    end
    return out
end

function loss(x, y)
    return sum((model(x) .- y).^2)
end

Let’s filter some noise and calculate the loss.

inp = randn(1000,1);
loss(inp, inp)


It seems that the filter is filtering something :wink:
So far so good. Now we could optimize the filter coefficient so that the filter stops filtering; e.g. for c = 1.0 the loss should become 0.
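As a quick sanity check (a sketch that simply rebinds the global c which LP closes over):

c = 1.0
loss(inp, inp)   # 0.0, since with c = 1.0 the filter passes the input through unchanged
c = 0.5          # restore the original coefficient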
However, neither:

Flux.train!(loss, params(model), [(inp, inp)], ADAM(0.1))


nor:

c = param(c)
gs = Tracker.gradient(() -> loss(inp, inp), params(c))

seem to work.

I get the error messages
MethodError: no method matching back!(::Float64)
and
MethodError: no method matching Float64(::Tracker.TrackedReal{Float64}),
respectively.

What am I missing?

Thanks in advance :slight_smile:

The problem is that your loss is a regular Float64 instead of a TrackedReal number, so Tracker is unable to backpropagate through it (that is where the back!(::Float64) error comes from). Also, I believe mutation of arrays will give difficulties: writing a TrackedReal into the preallocated Float64 array out converts it back to a plain number (the Float64(::Tracker.TrackedReal{Float64}) error), so it's better to push! to the out array.

function model(x)
    out = []                        # untyped vector, so it can hold TrackedReals
    for feature in x
        push!(out, recurLP(feature))
    end
    return out
end
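With this change the loss is a TrackedReal and your second attempt should go through. A minimal sketch of the gradient computation, assuming the Tracker-era Flux API from your post (param, params, Tracker.gradient) together with Flux.reset! to clear the filter state between calls:

c = param(0.5)                  # tracked filter coefficient
Flux.reset!(recurLP)            # clear state left over from earlier forward passes
gs = Tracker.gradient(() -> loss(inp, inp), params(c))
gs[c]                           # gradient of the loss with respect to c

With params(c) instead of params(model) (which collects nothing, since model is a plain function and c was an untracked global), Flux.train! should also run; just remember to reset the state before each forward pass so the graph from previous iterations is not dragged along.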