Toy Flux example with one parameter not working?

I am trying to write a blog post for people with no background in machine learning, using the Flux framework. However, I can't figure out how to get this minimal example to work.

using Flux
w = param(rand())
x = 1
y = 0.5

loss(x) = (w*x - y)^2

Flux.train!(loss, params(w), (x,), ADAM())

I thought the above would just work. But I am getting this error

ERROR: MethodError: no method matching copyto!(::Tracker.TrackedReal{Float64}, ::Base.Broadcast.Broadcasted{Tracker.TrackedStyle,Tuple{},typeof(+),Tuple{Base.Broadcast.Broadcasted{Tracker.TrackedStyle,Nothing,typeof(*),Tuple{Float64,Tracker.TrackedReal{Float64}}},Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{0},Nothing,typeof(*),Tuple{Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{0},Nothing,typeof(-),Tuple{Int64,Float64}},Float64}}}})
Closest candidates are:
  copyto!(::AbstractArray, ::Base.Broadcast.Broadcasted) at broadcast.jl:863
  copyto!(::AbstractArray, ::Any) at abstractarray.jl:720
  copyto!(::Any, ::Base.Broadcast.Broadcasted{#s162,Axes,F,Args} where Args<:Tuple where F where Axes where #s162<:StaticArrays.StaticArrayStyle) at C:\Users\RTX2080\.julia\packages\StaticArrays\DBECI\src\broadcast.jl:29
Stacktrace:
 [1] materialize!(::Tracker.TrackedReal{Float64}, ::Base.Broadcast.Broadcasted{Tracker.TrackedStyle,Nothing,typeof(+),Tuple{Base.Broadcast.Broadcasted{Tracker.TrackedStyle,Nothing,typeof(*),Tuple{Float64,Tracker.TrackedReal{Float64}}},Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{0},Nothing,typeof(*),Tuple{Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{0},Nothing,typeof(-),Tuple{Int64,Float64}},Float64}}}}) at .\broadcast.jl:822
 [2] apply!(::ADAM, ::Tracker.TrackedReal{Float64}, ::Float64) at C:\Users\RTX2080\.julia\packages\Flux\dkJUV\src\optimise\optimisers.jl:104
 [3] update!(::ADAM, ::Tracker.TrackedReal{Float64}, ::Tracker.TrackedReal{Float64}) at C:\Users\RTX2080\.julia\packages\Flux\dkJUV\src\optimise\train.jl:6
 [4] update!(::ADAM, ::Tracker.Params, ::Tracker.Grads) at C:\Users\RTX2080\.julia\packages\Flux\dkJUV\src\optimise\train.jl:11
 [5] macro expansion at C:\Users\RTX2080\.julia\packages\Flux\dkJUV\src\optimise\train.jl:74 [inlined]
 [6] macro expansion at C:\Users\RTX2080\.julia\packages\Juno\oLB1d\src\progress.jl:134 [inlined]
 [7] #train!#12(::Flux.Optimise.var"#16#22", ::typeof(Flux.Optimise.train!), ::Function, ::Tracker.Params, ::Tuple{Int64}, ::ADAM) at C:\Users\RTX2080\.julia\packages\Flux\dkJUV\src\optimise\train.jl:69
 [8] train!(::Function, ::Tracker.Params, ::Tuple{Int64}, ::ADAM) at C:\Users\RTX2080\.julia\packages\Flux\dkJUV\src\optimise\train.jl:67
 [9] top-level scope at REPL[3]

It works if you make w an array:

w = param(rand(1))
loss(x) = (w[1]*x - y)^2

I think it might not be possible to make this work when w is a plain number, since train! needs to mutate the parameter's contents in place, and a scalar can't be updated that way.
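Putting the pieces together, here is a minimal sketch of the full workaround, assuming the Tracker-era Flux API (pre-0.10) that your stack trace shows. Note that train! treats its data argument as an iterable of tuples that get splatted into the loss, so the single sample is wrapped as [(x,)]; the loop count of 200 is an arbitrary choice for illustration:

```julia
using Flux  # Tracker-era Flux, matching the stack trace above

w = param(rand(1))  # make the parameter a 1-element array so it can be mutated in place
x = 1
y = 0.5

# index into w, since the parameter is now an array
loss(x) = (w[1]*x - y)^2

opt = ADAM()
for epoch in 1:200
    # data is an iterable of tuples; each tuple is splatted into loss
    Flux.train!(loss, params(w), [(x,)], opt)
end

# w[1] should move toward y/x = 0.5 as the loss decreases
println(w[1])
```

The key point is that the optimizer writes the updated value back into the parameter's storage, which is only possible for a mutable container like an array, not for a bare Float64.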


I see. It needs to be an array, or else there's no way to mutate the value in place. Thanks for that!