# Issue with BNN example from Turing Tutorial

Hi, student and first-time poster here.
I've been trying to work through the tutorials from the Turing documentation, and I can't seem to get the Bayesian Neural Network example to run. More specifically, the error seems to occur when the model is passed to the `sample` function.
I get the same error from both the Generic Bayesian Neural Network example and the more in-depth example earlier in the tutorial. Here's a link for the curious:
https://turing.ml/dev/tutorials/3-bayesnn/
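For context, the snippet below assumes `xs` (a vector of 2-element inputs) and `ts` (binary labels) defined earlier in the tutorial. A minimal stand-in, using hypothetical Gaussian clusters rather than the tutorial's exact data, would be:

```julia
using Random
Random.seed!(0)

N = 80
ts = rand(Bool, N)                                # hypothetical binary labels
# One Gaussian cluster per class: class 1 near (2, 2), class 0 near (-2, -2).
xs = [randn(2) .+ (t ? 2.0 : -2.0) for t in ts]

# `hcat(xs...)` then yields the 2×N input matrix expected by nn_forward.
```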

``````julia
# Import libraries.
using Turing, Flux, Plots, Random
# Specify the network architecture.
network_shape = [
    (3, 2, :tanh),
    (2, 3, :tanh),
    (1, 2, :σ)]

# Regularization, parameter variance, and total number of
# parameters.
alpha = 0.09
sig = sqrt(1.0 / alpha)
num_params = sum([i * o + i for (i, o, _) in network_shape])

# This modification of the unpack function generates a series of vectors
# given a network shape.
function unpack(θ::AbstractVector, network_shape::AbstractVector)
    index = 1
    weights = []
    biases = []
    for layer in network_shape
        rows, cols, _ = layer
        n_weights = rows * cols  # renamed from `size` to avoid shadowing Base.size
        last_index_w = n_weights + index - 1
        last_index_b = last_index_w + rows
        push!(weights, reshape(θ[index:last_index_w], rows, cols))
        push!(biases, reshape(θ[last_index_w+1:last_index_b], rows))
        index = last_index_b + 1
    end
    return weights, biases
end

# Generate an abstract neural network given a shape,
# and return a prediction.
function nn_forward(x, θ::AbstractVector, network_shape::AbstractVector)
    weights, biases = unpack(θ, network_shape)
    layers = []
    for i in eachindex(network_shape)
        push!(layers, Dense(weights[i],
            biases[i],
            eval(network_shape[i][3])))
    end
    nn = Chain(layers...)
    return nn(x)
end

# General Turing specification for a BNN model.
@model bayes_nn(xs, ts, network_shape, num_params) = begin
    θ ~ MvNormal(zeros(num_params), sig .* ones(num_params))
    preds = nn_forward(xs, θ, network_shape)
    for i in 1:length(ts)
        ts[i] ~ Bernoulli(preds[i])
    end
end

# Set the backend.

# Perform inference.
num_samples = 500
ch2 = sample(bayes_nn(hcat(xs...), ts, network_shape, num_params), NUTS(0.65), num_samples);
``````

Here's the stacktrace; it's a long one:

``````
ERROR: LoadError: UndefVarError: forward not defined
Stacktrace:
[3] track at C:\Users\nickp\.julia\packages\Tracker\JhqMQ\src\Tracker.jl:52 [inlined]
[8] logpdf_with_trans at C:\Users\nickp\.julia\packages\Bijectors\hFDmJ\src\Bijectors.jl:417 [inlined]
[10] macro expansion at C:\Users\nickp\.julia\packages\Turing\fbY6B\src\core\compiler.jl:101 [inlined]
[11] macro expansion at .\untitled-f87f5efba838ef24ae6689ad3bf587e2:91 [inlined]
[13] #_#3 at C:\Users\nickp\.julia\packages\Turing\fbY6B\src\Turing.jl:72 [inlined]
[14] Model at C:\Users\nickp\.julia\packages\Turing\fbY6B\src\Turing.jl:72 [inlined]
[15] runmodel! at C:\Users\nickp\.julia\packages\Turing\fbY6B\src\core\RandomVariables.jl:716 [inlined]
[17] #20 at C:\Users\nickp\.julia\packages\Tracker\JhqMQ\src\back.jl:148 [inlined]
[19] forward(::Function, ::Array{Float64,1}) at C:\Users\nickp\.julia\packages\Tracker\JhqMQ\src\back.jl:148
[22] ∂logπ∂θ at C:\Users\nickp\.julia\packages\Turing\fbY6B\src\inference\hmc.jl:403 [inlined]
[26] #find_good_eps at C:\Users\nickp\.julia\packages\Turing\fbY6B\src\inference\hmc.jl:0 [inlined]
[32] Sampler at C:\Users\nickp\.julia\packages\Turing\fbY6B\src\inference\hmc.jl:305 [inlined]
[33] #sample#2(::Nothing, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(sample), ::Turing.Model{Tuple{:θ},Tuple{:xs,:ts,:network_shape,:num_params},var"##inner_function#483#83",NamedTuple{(:xs,
``````

Try `]add DistributionsAD`, and if you already have it, `]up DistributionsAD`.
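For anyone unfamiliar with the `]` Pkg REPL mode, the equivalent from a script uses the standard `Pkg` API (this only modifies your package environment, so there is nothing to run beyond it):

```julia
using Pkg
Pkg.add("DistributionsAD")     # install the package if it is missing
Pkg.update("DistributionsAD")  # or bring an existing install up to date
```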