CharNN in Flux zoo


I am running the charRNN example from the Flux model zoo and ran into an error in the loss function:

`Flux.truncate!` not found. I found that this function existed in an older version of Flux, where it reset the gradient history of a model, but it no longer exists and I can find no record of what one should do instead. So my question is: does Flux now take care of resetting the gradient automatically every time the loss is computed? I have seen no mention of this in the documentation. I am using Julia 1.3 and Flux 0.10.3. Thank you.

For completeness, I include the entire charNN program.

using Flux
using Flux: onehot, chunk, batchseq, throttle, crossentropy
using StatsBase: wsample
using Base.Iterators: partition


# Download the training text if it is not already present (as in the model zoo)
isfile("input.txt") ||
  download("https://cs.stanford.edu/people/karpathy/char-rnn/shakespeare_input.txt", "input.txt")

text = collect(String(read("input.txt")))
alphabet = [unique(text)..., '_']
# text: 1D array: each element is vector of size=size(alphabet)[1]
text = map(ch -> onehot(ch, alphabet), text)
stop = onehot('_', alphabet)

N = length(alphabet)
seqlen = 50
# Normally, one specifies the batch size
nbatch = 50  # number of batches

# Xs[1:1830][1:50][68,50] or Xs[1:3659][seqlength][length(alphabet),nbatch]
Xs = collect(partition(batchseq(chunk(text, nbatch), stop), seqlen))
Ys = collect(partition(batchseq(chunk(text[2:end], nbatch), stop), seqlen))

m = Chain(
  LSTM(N, 128),
  LSTM(128, 128),
  Dense(128, N),
  softmax)  # softmax needed since the loss below uses crossentropy

m = gpu(m)

function loss(xs, ys)
  l = sum(crossentropy.(m.(gpu.(xs)), gpu.(ys)))
  return l
end

opt = ADAM(0.01)
tx, ty = (Xs[5], Ys[5])
evalcb = () -> @show loss(tx, ty)

Flux.train!(loss, params(m), zip(Xs, Ys), opt,
            cb = throttle(evalcb, 30))

# Sampling

function sample(m, alphabet, len)
  m = cpu(m)
  buf = IOBuffer()
  c = rand(alphabet)
  for i = 1:len
    write(buf, c)
    # In Flux 0.10 the model returns a plain array, so no `.data` field is needed
    c = wsample(alphabet, m(onehot(c, alphabet)))
  end
  return String(take!(buf))
end

sample(m, alphabet, 1000) |> println


Try `Flux.reset!`; it should work the same way.
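To expand on that: in Flux 0.10 gradients are recomputed from scratch by Zygote on every call, so there is nothing to "reset" on the gradient side. What `Flux.reset!(m)` does is reset the hidden state of recurrent layers (the LSTMs here) back to their initial values, so that each training sequence starts fresh. A minimal sketch of the loss with it, assuming the model `m` from the post:

```julia
function loss(xs, ys)
  l = sum(crossentropy.(m.(gpu.(xs)), gpu.(ys)))
  Flux.reset!(m)  # reset LSTM hidden state so the next sequence starts fresh
  return l
end
```

Truncation of backpropagation through time happens implicitly: since `reset!` is called inside the loss, gradients never flow across sequence boundaries.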