Bayesian multidimensional scaling in Turing.jl: no samples accepted?

I am trying to implement Bayesian multidimensional scaling in Turing (from this paper: https://sites.stat.washington.edu/raftery/Research/PDF/oh2001.pdf). I have a model that I think is correct, but no samples seem to get accepted, so the sampler never terminates (or updates the ETA).
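For reference, the model I am aiming for (as I read the paper; the priors below are my own simplification, not the paper's) observes each dissimilarity as a truncated normal around the Euclidean distance between latent points:

d_ij ~ Normal(δ_ij, σ²) truncated to d_ij > 0, with δ_ij = ||x_i − x_j||,

and with priors x_i ~ MvNormal(0, I_p) and σ² ~ Exponential(1). (Note that Normal in Distributions.jl takes the standard deviation as its second argument, so the σ2 in my code really acts as an sd.)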

Here is my code; I use the S-curve from ManifoldLearning as test data.

using Turing, StatsPlots, Random, LinearAlgebra, MultivariateStats, ManifoldLearning
using Plots


@model function bmds(p_max, d)
    p ~ DiscreteUniform(1, p_max)
    n = size(d, 1)
    xs = Vector{Vector{Real}}(undef, n)

    # prior for the variance
    σ2 ~ Exponential()

    # priors for the points, centered at zero
    for i in 1:n
        xs[i] ~ MvNormal(zeros(p), Diagonal(ones(p)))
    end

    # each observed dissimilarity is a truncated normal around the latent distance
    for i in 1:n
        for j in 1:(i - 1)
            δ_ij = sqrt(sum(abs2, xs[i] .- xs[j]))
            d[i, j] ~ truncated(Normal(δ_ij, σ2); lower = 0.0)
        end
    end
end
# test dataset: the S-curve from ManifoldLearning
test_dataset = ManifoldLearning.scurve()[1]
# pairwise Euclidean distances between the columns of pts
dist_matrix(pts) = [sqrt(sum(abs2, pts[:, i] .- pts[:, j])) for i in axes(pts, 2), j in axes(pts, 2)]
dm = dist_matrix(test_dataset)
chain = sample( bmds(3, dm), SMC(10_000), MCMCThreads(), 1_500, 10)

describe(chain)

In general, I am not sure how to peer inside the @model macro.
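One way I found to at least look at the generated code is Julia's @macroexpand (a minimal sketch; also, a model instance is callable, which draws every ~ statement from its prior and makes a quick smoke test):

# show the code the @model macro generates for a toy model
expanded = @macroexpand @model function toy()
    x ~ Normal()
end
println(expanded)

# calling a model instance runs it, sampling all ~ statements from their priors
m = bmds(3, dm)
m()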

Edit: Hong Ge on Slack pointed out a mistake in the model specification, and that HMC does not support discrete variables. However, the model still does not work.
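For what it is worth, the usual Turing pattern for mixed discrete/continuous models is to compose samplers with Gibbs, e.g. a particle sampler for p and HMC for everything else. A sketch only (untested, and probably still problematic here, since the dimension of xs changes with p):

# hypothetical composition: PG updates the discrete p, HMC updates σ2 and the points
chain = sample(bmds(3, dm), Gibbs(PG(20, :p), HMC(0.05, 10, :σ2, :xs)), 1_000)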

Thanks to Hong Ge on Slack, it now works with p fixed:

using Turing, StatsPlots, Random, LinearAlgebra, MultivariateStats, ManifoldLearning, ReverseDiff
using Plots
Turing.setadbackend(:reversediff)

@model function bmds(p, d, ::Type{T} = Float64) where {T}
    n = size(d, 1)
    xs = Vector{Vector{T}}(undef, n)

    # prior for the variance
    σ2 ~ Exponential()

    # priors for the points: zero mean with a wide prior
    for i in 1:n
        xs[i] ~ MvNormal(zeros(p), Diagonal(fill(10.0, p)))
    end

    # accumulate the squared distance in a plain loop
    for i in 1:n
        for j in 1:(i - 1)
            δ_ij = zero(T)
            for k in 1:p
                δ_ij += (xs[i][k] - xs[j][k])^2
            end
            d[i, j] ~ truncated(Normal(sqrt(δ_ij), σ2); lower = 0.0)
        end
    end
end

# test dataset: the S-curve from ManifoldLearning
test_dataset = ManifoldLearning.scurve()[1]
# pairwise Euclidean distances between the columns of pts
dist_matrix(pts) = [sqrt(sum(abs2, pts[:, i] .- pts[:, j])) for i in axes(pts, 2), j in axes(pts, 2)]
dm = dist_matrix(test_dataset)
m = bmds(2, dm)

chain = sample(m, HMC(0.1, 5), MCMCThreads(), 100, 4)
describe(chain)

The type parameter T is there for type stability: it lets the xs container be concretely typed, so it can hold the AD tracker types during sampling.

It is quite slow though.
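One change that might help (a sketch, not benchmarked; bmds_vec and lower_tri are names I made up, while filldist and arraydist are exported by Turing) is to flatten the lower triangle of d once outside the model and make a single vectorized observation instead of n(n-1)/2 scalar ~ statements:

# flatten the lower triangle of the distance matrix, row by row
lower_tri(d) = [d[i, j] for i in 2:size(d, 1) for j in 1:(i - 1)]

@model function bmds_vec(p, dvec, n)
    σ2 ~ Exponential()
    # p × n matrix of latent points, one column per observation
    xs ~ filldist(MvNormal(zeros(p), Diagonal(fill(10.0, p))), n)
    # pairwise distances, in the same order as lower_tri
    δ = [sqrt(sum(abs2, xs[:, i] .- xs[:, j])) for i in 2:n for j in 1:(i - 1)]
    dvec ~ arraydist(truncated.(Normal.(δ, σ2); lower = 0.0))
end

chain = sample(bmds_vec(2, lower_tri(dm), size(dm, 1)), NUTS(0.65), 200)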