Is it possible to make `Normal(mu, sigma)` work for when `mu` and `sigma` are tracked arrays?

I am trying to write some code where the parameters define the distribution. This is what is done in variational auto-encoders (VAEs), but I am attempting something simpler: implementing a maximum likelihood estimator (MLE) using differentiable programming (DP).

```julia
using Flux, Distributions

# define negative log likelihood
negloglik(q, mu, sigma) = begin
  dn = Normal(mu, sigma)
  -sum(logpdf.(dn, q))
end

# define the parameters to track
mu_hat = param(rand(1))
sigma_hat = param(rand(1))

# some random data
q = rand(Normal(0, 1), 100)

negloglik(q, 0, 1) # this works
negloglik(q, mu_hat, sigma_hat) # this doesn't

negloglik(q) = negloglik(q, mu_hat, sigma_hat)
```
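For reference, the quantity `negloglik` computes is the usual normal negative log likelihood; writing it out in closed form makes the dependence on $\mu$ and $\sigma$ explicit:

$$
-\sum_{i=1}^n \log p(q_i \mid \mu, \sigma) = \frac{n}{2}\log(2\pi\sigma^2) + \sum_{i=1}^n \frac{(q_i - \mu)^2}{2\sigma^2}
$$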

The error is:

```
ERROR: MethodError: no method matching Normal(::TrackedArray{…,Array{Float64,1}}, ::TrackedArray{…,Array{Float64,1}})
 [1] negloglik(::Array{Float64,1}, ::TrackedArray{…,Array{Float64,1}}, ::TrackedArray{…,Array{Float64,1}}) at .\REPL[35]:1
 [2] fn(::Array{Float64,1}) at .\REPL[40]:1
 [3] top-level scope at none:0
```

so clearly `Normal` isn't defined for tracked arrays. I can easily define `Normal` for them, which I do:

```julia
Normal(mu::TrackedArray, sigma::TrackedArray) = Normal.(mu, sigma)

# define helper
negloglik(q) = negloglik(q, mu_hat, sigma_hat)

# train
Flux.train!(negloglik, [mu_hat, sigma_hat], q, ADAM())
```

Now I get this error:

```
ERROR: MethodError: no method matching back!(::Float64)
Closest candidates are:
  back!(::Any, ::Any; once) at C:\Users\L098905\.julia\packages\Tracker\6wcYJ\sr
  back!(::Tracker.TrackedReal; once) at C:\Users\L098905\.julia\packages\Tracker
  back!(::TrackedArray) at C:\Users\L098905\.julia\packages\Tracker\6wcYJ\src\li
 [1] gradient_(::getfield(Flux.Optimise, Symbol("##15#21")){typeof(negloglik),Float64}, ::Tracker.Params) at C:\Users\L098905\.julia\packages\Tracker\6wcYJ\src\
 [2] #gradient#24(::Bool, ::Function, ::Function, ::Tracker.Params) at C:\Users\
 [3] gradient at C:\Users\L098905\.julia\packages\Tracker\6wcYJ\src\back.jl:164
 [4] macro expansion at C:\Users\L098905\.julia\packages\Flux\zNlBL\src\optimise\train.jl:71 [inlined]
 [5] macro expansion at C:\Users\L098905\.julia\packages\Juno\B1s6e\src\progress.jl:133 [inlined]
 [6] #train!#12(::getfield(Flux.Optimise, Symbol("##16#22")), ::Function, ::Function, ::Array{TrackedArray{…,Array{Float64,1}},1}, ::Array{Float64,1}, ::ADAM) at C:\Users\L098905\.julia\packages\Flux\zNlBL\src\optimise\train.jl:69
 [7] train!(::Function, ::Array{TrackedArray{…,Array{Float64,1}},1}, ::Array{Float64,1}, ::ADAM) at C:\Users\L098905\.julia\packages\Flux\zNlBL\src\optimise\tra
 [8] top-level scope at none:0
```

So how do I go about defining a `back!`? I am a bit stumped at the moment. I might be missing some theory, so I would appreciate any links to docs or blog posts.
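For context on what such a definition could look like: Tracker (the AD behind `param`) exposes a custom-gradient hook via `Tracker.track` and `Tracker.@grad`, so one usually writes a pullback for a function rather than overloading `back!` directly. Below is a minimal sketch of that pattern using a toy `minus` function; the name `minus`, its signatures, and its pullback are purely illustrative (following the pattern shown in Tracker's README), not anything from Distributions:

```julia
using Tracker
using Tracker: TrackedArray, track, @grad, data

# a toy function we want Tracker to differentiate through
minus(a, b) = a .- b

# intercept tracked inputs and record the call on the tape
minus(a::TrackedArray, b::TrackedArray) = track(minus, a, b)

# the pullback: given Δ = ∂l/∂output, return (∂l/∂a, ∂l/∂b)
@grad function minus(a, b)
    return minus(data(a), data(b)), Δ -> (Δ, -Δ)
end
```

The same shape would apply to a tracked `logpdf`: compute the value on the untracked `data`, and return a closure mapping the incoming sensitivity to sensitivities for each parameter.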

My understanding is that `back!` should return $\frac{\partial l}{\partial a}$ given $\frac{\partial l}{\partial f}$, where $f$ is a function of $a$, i.e. $f = g(a)$ for some function $g$. But I just can't figure out how to do this for the parameters of a distribution.
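In case concrete partials help for the normal case, differentiating the log-density by hand gives (standard results, worth double-checking):

$$
\log p(x \mid \mu, \sigma) = -\log \sigma - \tfrac{1}{2}\log(2\pi) - \frac{(x-\mu)^2}{2\sigma^2}
$$

$$
\frac{\partial \log p}{\partial \mu} = \frac{x - \mu}{\sigma^2}, \qquad
\frac{\partial \log p}{\partial \sigma} = \frac{(x-\mu)^2 - \sigma^2}{\sigma^3}
$$

so by the chain rule, a hand-written backward pass would only need to scale the incoming $\frac{\partial l}{\partial f}$ by these quantities.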