Hello,
I encountered this error when using a registered Julia package called “Knet”, and I wasn’t sure whether to file it as an issue for Knet or to ask about it here. I eventually decided to ask here, since the error is only reproducible in Atom; the same code runs smoothly in the regular Julia REPL.
So here is the problem:
I am using a deep learning package called Knet. When I run the following code in the REPL I get the expected output:
```julia
julia> using Knet

julia> x = Param([1,2,3])
3-element Param{Array{Int64,1}}:
 1
 2
 3

julia> y = @diff sum(abs2,x)
T(14)

julia> println(grad(y, x))
[2, 4, 6]
```
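(For reference, `sum(abs2, x)` computes 1^2 + 2^2 + 3^2 = 14, and the gradient of that sum with respect to `x` is 2x = [2, 4, 6], so both outputs above match what the math predicts.)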
`Param`, `@diff` and `grad` are Knet functions. The documentation for them is as follows:
> `Param(x)` returns a struct that acts like `x` but marks it as a parameter you want to compute gradients with respect to.
>
> `@diff expr` evaluates an expression and returns a struct that contains its value (which should be a scalar) and gradient information.
>
> `grad(y, x)` returns the gradient of `y` (output by `@diff`) with respect to any parameter `x::Param`, or `nothing` if the gradient is 0.
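Putting those three together, here is a minimal sketch of how I understand the intended usage. The second parameter `w` is my own addition, purely to illustrate the `nothing` case, and `value` is another Knet export that returns the scalar wrapped by the `@diff` result:

```julia
using Knet

x = Param([1,2,3])        # parameter we want gradients with respect to
w = Param([4,5])          # extra parameter, deliberately unused in the loss

y = @diff sum(abs2, x)    # record the computation; y wraps the scalar result

value(y)     # 14 -- the scalar value of the expression
grad(y, x)   # [2, 4, 6] -- the gradient of sum(abs2, x) is 2x
grad(y, w)   # nothing -- w played no part in computing y
```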
Yet when I run the same code line by line with Shift+Enter in Atom, the first two lines run with no problem, but the third line (namely `y = @diff sum(abs2,x)`) produces the following error:
```
Julia Client – Internal Error
MethodError: no method matching iterate(::AutoGrad.Node)
Closest candidates are:
  iterate(!Matched::Core.SimpleVector) at essentials.jl:578
  iterate(!Matched::Core.SimpleVector, !Matched::Any) at essentials.jl:578
  iterate(!Matched::ExponentialBackOff) at error.jl:171
```
What seems to be the problem here?