Can't differentiate foreigncall expression

I'm trying to use Flux's automatic differentiation to solve the toy PINN problem below, but I keep getting the error and traceback shown here. When I use a manually differentiated (finite-difference) term for g_grad it works fine, but not with auto-diff.
Here is the error traceback:
“”"
Can’t differentiate foreigncall expression
error(::String) at error.jl:33
get at iddict.jl:87 [inlined]
(::typeof(∂(get)))(::Nothing) at interface2.jl:0
accum_global at lib.jl:56 [inlined]
(::typeof(∂(accum_global)))(::Nothing) at interface2.jl:0
#89 at lib.jl:67 [inlined]
(::typeof(∂(λ)))(::Nothing) at interface2.jl:0
#1550#back at adjoint.jl:59 [inlined]
(::typeof(∂(λ)))(::Nothing) at interface2.jl:0
getindex at tuple.jl:24 [inlined]
gradindex at reverse.jl:12 [inlined]
(::typeof(∂(λ)))(::Tuple{Nothing,Float64}) at interface2.jl:0
#41 at interface.jl:40 [inlined]
(::typeof(∂(λ)))(::Tuple{Float64}) at interface2.jl:0
gradient at interface.jl:49 [inlined]
(::typeof(∂(gradient)))(::Tuple{Float64}) at interface2.jl:0
#31 at pinn2.jl:12 [inlined]
(::typeof(∂(#31)))(::Tuple{Float64}) at interface2.jl:0
#33 at none:0 [inlined]
(::typeof(∂(#33)))(::Float64) at interface2.jl:0
iterate at generator.jl:47 [inlined]
(::typeof(∂(iterate)))(::Tuple{Float64,Nothing}) at interface2.jl:0
mean at Statistics.jl:76 [inlined]
(::typeof(∂(mean)))(::Float64) at interface2.jl:0
mean at Statistics.jl:44 [inlined]
loss at pinn2.jl:17 [inlined]
(::typeof(∂(loss)))(::Float64) at interface2.jl:0
#150 at lib.jl:191 [inlined]
#1693#back at adjoint.jl:59 [inlined]
#14 at train.jl:83 [inlined]
(::Zygote.var"#54#55"{Zygote.Params,Zygote.Context,typeof(∂(#14))})(::Float64) at interface.jl:172
gradient(::Function, ::Zygote.Params) at interface.jl:49
macro expansion at train.jl:82 [inlined]
macro expansion at progress.jl:119 [inlined]
train!(::Function, ::Zygote.Params, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{}}}, ::Descent; cb::var"#23#24") at train.jl:80
(::Flux.Optimise.var"#train!##kw")(::NamedTuple{(:cb,),Tuple{var"#23#24"}}, ::typeof(Flux.Optimise.train!), ::Function, ::Zygote.Params, ::Base.Iterators.Take{Base.Iterators.Repeated{Tuple{}}}, ::Descent) at train.jl:78
top-level scope at pinn2.jl:31

“”"

And here is the code (with the manually differentiated line for g_grad commented out):
“”"
using Flux
using Statistics

NNODE = Chain(x -> [x], # Take in a scalar and transform it into an array
Dense(1,32,tanh),
Dense(32,1),
first) # Take first value, i.e. return a scalar
NNODE(0.5)

g = t -> t*NNODE(t) + 1f0

g_grad = t -> gradient(g,t)
ϵ = sqrt(eps(Float32))
#g_grad = t -> (g(t+ϵ)-g(t))/ϵ

#loss() = mean(abs2(g_grad(t)[1] - cos(2π*t)) for t in 0:0.01:1.0)
loss() = mean(abs2(g_grad(t)[1] - cos(2π*t)) for t in 0:1f-2:1f0)
#loss() = mean(abs2(((g(t+ϵ)-g(t))/ϵ) - cos(2π*t)) for t in 0:1f-2:1f0)

opt = Flux.Descent(0.01)
data = Iterators.repeated((), 5000)

iter = 0
cb = function () # callback function to observe training
    global iter += 1
    if iter % 50 == 0
        display(loss())
    end
end
display(loss())
Flux.train!(loss, Flux.params(NNODE), data, opt, cb=cb)
“”"
I know this shouldn't be an issue, but I don't know what I am doing wrong. Any help would be appreciated. Thanks!
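For reference, here is the script with the manually differentiated (finite-difference) g_grad swapped in, assembled from the commented-out lines above; this is the version that runs and trains for me without the error:

```julia
using Flux
using Statistics

NNODE = Chain(x -> [x],        # take in a scalar and wrap it in a length-1 array
              Dense(1, 32, tanh),
              Dense(32, 1),
              first)           # take the first value, i.e. return a scalar

g = t -> t*NNODE(t) + 1f0      # trial solution; g(0) == 1f0 by construction

# Forward finite-difference approximation of dg/dt instead of a nested
# Zygote gradient call:
ϵ = sqrt(eps(Float32))
g_grad = t -> (g(t + ϵ) - g(t)) / ϵ

loss() = mean(abs2(g_grad(t) - cos(2π*t)) for t in 0:1f-2:1f0)

opt = Flux.Descent(0.01)
data = Iterators.repeated((), 5000)
Flux.train!(loss, Flux.params(NNODE), data, opt)
```

The only change from the script above is that g_grad no longer calls gradient inside the loss; everything else is the same. What I'd like to understand is why the nested gradient(g, t) version fails.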

Did you find a solution?