Broadcast error in subtracting Matrix from Grads(...) object

Hi there,

I am new to ML in Julia and loving it so far. I am trying to evaluate a fairly involved expression: an average of a gradient combined with a few other terms. I need the value of the final expression to be a Grads(…) object, so that I can then update my Params using Flux.update!(opt, params, grads).
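For context, the update pattern I am aiming for is the standard implicit-parameter one (a minimal sketch with a toy loss, just to show the types involved):

using Flux, Zygote

model = Chain(Dense(1, 1))
θ = Flux.params(model)
opt = Descent(0.01)

grads = gradient(() -> sum(model([1.0])), θ)  ## gradient over Params returns a Zygote.Grads
Flux.update!(opt, θ, grads)                   ## update! expects exactly this Grads alongside the Params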

I am using the following setup:

Julia Version 1.8.5
Commit 17cfb8e65ea (2023-01-08 06:45 UTC)
Platform Info:
  OS: macOS (x86_64-apple-darwin21.4.0)
  CPU: 16 × Intel(R) Core(TM) i9-9880H CPU @ 2.30GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-13.0.1 (ORCJIT, skylake)
  Threads: 1 on 16 virtual cores
Environment:
  JULIA_EDITOR = code
  JULIA_NUM_THREADS = 

and packages Flux v0.13.17 and Zygote v0.6.62.

The code itself is:

using Zygote, Flux
import LinearAlgebra: tr, Diagonal

struct NeuralAnsatz
    chain::Chain
end

## the ansatz is the squared exponential of the network output, so it is positive
function (m::NeuralAnsatz)(x)
    return exp.(m.chain(x)).^2
end

Flux.@functor NeuralAnsatz

## gradient and Laplacian (trace of the Hessian) of 𝐱 -> sum(g(𝐱))
∇(g::Function, 𝐱::Vector) = gradient(𝐱 -> sum(g(𝐱)), 𝐱)
∇²(g::Function, 𝐱::Vector) = sum(Diagonal(hessian(𝐱 -> sum(g(𝐱)), 𝐱)))
∇(g::NeuralAnsatz, 𝐱::Vector) = gradient(𝐱 -> sum(g(𝐱)), 𝐱)
∇²(g::NeuralAnsatz, 𝐱::Vector) = sum(Diagonal(hessian(𝐱 -> sum(g(𝐱)), 𝐱)))
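## Quick sanity check of these helpers on a plain function (an aside, not part of
## the failing script): ∇ returns a one-element tuple, ∇² the trace of the Hessian:
##   ∇(x -> [x[1]^2 + x[2]^2], [1.0, 2.0])   ## -> ([2.0, 4.0],)
##   ∇²(x -> [x[1]^2 + x[2]^2], [1.0, 2.0])  ## -> 4.0, the Laplacian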

chain = Chain(Dense(1, 1, relu))
Ψ = NeuralAnsatz(chain)

ϕ(𝐱::Vector) = 1/2 * 𝐱[1]^2                                ## harmonic potential
Ĥ(𝐱::Vector, ψ::Function) = -∇²(ψ, 𝐱)/2 .+ ϕ(𝐱) * ψ(𝐱)
ε₀(𝐱::Vector, ψ::Function) = ψ(𝐱).^-1 .* Ĥ(𝐱, ψ)           ## ground-state energy

Ĥ(𝐱::Vector, ψ::NeuralAnsatz) = -∇²(ψ, 𝐱)/2 .+ ϕ(𝐱) * ψ(𝐱)
ε₀(𝐱::Vector, ψ::NeuralAnsatz) = ψ(𝐱).^-1 .* Ĥ(𝐱, ψ)
ε₀(𝐱::Float64, ψ::NeuralAnsatz) = ψ(𝐱).^-1 .* Ĥ(𝐱, ψ)
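## i.e. the local energy ε₀(𝐱) = (Ĥψ)(𝐱) / ψ(𝐱), with (Ĥψ)(𝐱) = -∇²ψ(𝐱)/2 + ϕ(𝐱)ψ(𝐱)
## (just restating the definitions above in math form)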

N = 10

𝐫 = rand(Float64, N)
Θ = Flux.params(Ψ)

log∇0 = gradient(() -> sum(log.(Ψ([𝐫[1]]))), Θ)
𝒪 = [1/Ψ([𝐫[1]]) .* log∇0 .* Ĥ([𝐫[1]], Ψ), ε₀([𝐫[1]], Ψ), log∇0]

for i in 2:N
    log∇ = gradient(() -> sum(log.(Ψ([𝐫[i]]))), Θ)
    𝒪[1] .+= 1/Ψ([𝐫[i]]) .* log∇ .* Ĥ([𝐫[i]], Ψ)   ## these are Grads objects
    𝒪[2] .+= ε₀([𝐫[i]], Ψ)[1]
    𝒪[3] .+= log∇
end

g = 2.0 .* 𝒪[1] .- 2.0 .* 𝒪[2] ./ N .* 𝒪[3] ./ N

The stack trace is:

ERROR: MethodError: no method matching +(::Float64, ::Matrix{Float64})
For element-wise addition, use broadcasting with dot syntax: scalar .+ array
Closest candidates are:
  +(::Any, ::Any, ::Any, ::Any...) at operators.jl:591
  +(::T, ::T) where T<:Union{Float16, Float32, Float64} at float.jl:383
  +(::Union{Float16, Float32, Float64}, ::BigFloat) at mpfr.jl:414
  ...
Stacktrace:
 [1] map!(::Function, ::Zygote.Grads, ::Zygote.Grads, ::Zygote.Grads)
   @ Zygote ~/.julia/packages/Zygote/JeHtr/src/compiler/interface.jl:372
 [2] map
   @ ~/.julia/packages/Zygote/JeHtr/src/compiler/interface.jl:365 [inlined]
 [3] broadcasted(f::Function, gs::Zygote.Grads, gss::Zygote.Grads)
   @ Zygote ~/.julia/packages/Zygote/JeHtr/src/compiler/interface.jl:348
 [4] top-level scope
   @ REPL[8]:1

I would greatly appreciate any help. The error clearly happens when subtracting the product 2.0 .* 𝒪[2] ./ N .* 𝒪[3] ./ N from 2.0 .* 𝒪[1]. The first expression returns a Grads(…) object, all good. The second expression seems to return a 2-element Vector{Array{Float64}}, which is no good.
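If it helps narrow things down, this is the kind of check that shows the mismatch (to be run after the script above; the comments record what I observe, not documented behaviour):

typeof(2.0 .* 𝒪[1])               ## Zygote.Grads — broadcasting a Grads with a scalar stays a Grads
typeof(𝒪[2] ./ N .* 𝒪[3] ./ N)    ## this is where the 2-element Vector{Array{Float64}} shows up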

Thanks!