Suppose I have the following variable, bounded in [0, 1]:

```
using Turing, Distributions
using GLMakie
using StatsFuns: logistic

# A mixture: mostly-low values, uniform values, and mostly-high values
y = vcat(rand(Beta(2, 10), 200),
         rand(Uniform(0, 1), 100),
         rand(Beta(10, 2), 200))
hist(y)
```

I would like to know whether it is possible, within a Turing model, to transform this variable (dichotomize it to 0 and 1 where y > 0.5) and then fit a logistic model to the dichotomized variable.
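To be clear about the transformation I have in mind, outside of a Turing model it is just a broadcasted comparison (the variable name `choice` here is only illustrative):

```
# Dichotomize: 1 if the value exceeds 0.5, 0 otherwise
choice = ifelse.(y .> 0.5, 1, 0)
# equivalently: Int.(y .> 0.5)
```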

I tried the following:

```
@model function model_choice(y)
    # Prior on the intercept (probability on the logit scale)
    choice_intercept ~ Normal(0, 1)
    for i in 1:length(y)
        # Logistic function
        v = logistic(choice_intercept)
        # Transform to binary choice
        choice = ifelse(y[i] > 0.5, 1, 0)
        # Likelihood
        choice ~ Bernoulli(v)
    end
end

fit = model_choice(y)
chains = sample(fit, NUTS(), 400)
```

While this successfully samples:

```
Summary Statistics
        parameters      mean      std     mcse  ess_bulk  ess_tail     rhat  ess_per_sec
            Symbol   Float64  Float64  Float64   Float64   Float64  Float64      Float64

  choice_intercept   -0.5054   0.0000   0.0000    1.4125   20.8078   2.0369       0.1158
            choice    1.0000   0.0000      NaN       NaN       NaN      NaN          NaN
```

I am not sure I am doing this correctly, because 1) `choice` shows up in the results as a parameter, and it probably shouldn't be there; and 2) I can't use `predict()` on that model:

```
pred = predict(model_choice([missing for i in 1:length(y)]), chains)
```

```
ERROR: MethodError: no method matching ifelse(::Missing, ::Int64, ::Int64)
Closest candidates are:
ifelse(::Bool, ::Any, ::Any)
@ Base essentials.jl:647
```

I am assuming I am not doing this correctly. What is the best way to do this? Thanks for any pointers!