Turing Sampler remembering values from previous runs

Below I am trying to build a Bayesian network model. As part of model checking, I compared the prior distribution to the posterior distribution. For my model the posterior is the same as the prior, with no change at all. What is weird is that if I change the prior to a new value (say, the mean of the Normal distribution for parameter m from 5 to 0.5), the posterior mean for m in the new run is still centered around 5, which was the prior and posterior mean from the previous run. I am also attaching a plot that illustrates the problem.

using Pkg
Pkg.activate(".")
Pkg.add(["Turing", "LazyArrays", "Plots", "StatsPlots", "DataFrames"])  # Pkg.add takes package names as strings
using Turing
using LazyArrays
using Plots, StatsPlots, DataFrames

# Generating Fake Data 
nstudents = 40 
nitems = 5
m = [1.2, 0.5]
b = [-0.5, 0.25, 0.125, 1.45, 1.00]
eff_theta = Matrix{Float64}(undef, nstudents, nitems)  # give the matrix a concrete eltype
lambda_OA = [0.8, 0.2]
AF = rand(Categorical([0.6, 0.4]), nstudents)  # use nstudents rather than the literal 40
OA = rand(Categorical(lambda_OA), nstudents)
for i in 1:nstudents 
    for j in 1:nitems
        eff_theta[i,j] = min(m[1]*OA[i], m[2]*AF[i]) - b[j]  # min avoids allocating a temporary vector
    end
end

X = rand.(BernoulliLogit.(eff_theta))
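For reference, `BernoulliLogit(θ)` is a Bernoulli distribution whose success probability is the logistic transform of θ. A minimal plain-Julia sketch of one generative step, using hypothetical example values rather than the simulated data:

```julia
# Inverse-logit link applied by BernoulliLogit to its argument.
logistic(x) = 1 / (1 + exp(-x))

# One student/item pair, mirroring the data-generating loop above
# (example values, not the simulated draws):
m  = [1.2, 0.5]
b1 = -0.5
oa, af = 2, 1                            # category draws for OA and AF
eff = min(m[1] * oa, m[2] * af) - b1     # min(2.4, 0.5) - (-0.5) = 1.0
p   = logistic(eff)                      # P(X = 1) ≈ 0.73
```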

# Model Definition 
@model function bn(X,AF)
    nstudents, n_items = size(X)
    θ = Matrix{Real}(undef, nstudents, n_items)  # eltype Real so θ can also hold dual numbers under gradient-based samplers

    m ~ filldist(truncated(Normal(5,0.1), 0, Inf), 2)
    b ~ filldist(Normal(0,1), n_items)

    λ_OA ~ Dirichlet([2,2])                            
    OA ~ filldist(Categorical(λ_OA), nstudents)  

    for i in 1:nstudents 
        for j in 1:n_items
            θ[i,j] = min(m[1]*OA[i], m[2]*AF[i]) - b[j]
        end
    end
    X .~ BernoulliLogit.(θ)
end

prior_chain = sample(bn(X, AF), Prior(), 1000)

post_chain = sample(bn(X, AF),
                    SMC(),
                    MCMCThreads(), 2_000, 4)

begin
    p1 = density(prior_chain[Symbol("m[1]")], color=:black, linewidth=1.8, label="prior")
    density!(p1, post_chain[Symbol("m[1]")], label="posterior")
end
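One thing worth noting when comparing the two densities: the prior `truncated(Normal(5, 0.1), 0, Inf)` is extremely tight, which can by itself make the posterior look almost identical to the prior even when the sampler is working. A quick base-Julia check of how narrow that prior is (the truncation point 0 sits 50σ below the mean, so the prior is effectively just `Normal(5, 0.1)`):

```julia
# Central ~95% interval of the (effectively untruncated) prior on m:
μ, σ = 5.0, 0.1
interval = (μ - 1.96σ, μ + 1.96σ)    # ≈ (4.804, 5.196)
```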