Turing.jl: Sometimes this error is thrown: Cannot `convert` an object of type Geometric{Float64} to an object of type Dirac{Float64}

Sometimes, but not always, this error is thrown for the model below when using the SMC() sampler:

ERROR: LoadError: CTaskException:
MethodError: Cannot `convert` an object of type Geometric{Float64} to an object of type Dirac{Float64}
Closest candidates are:
  convert(::Type{T}, ::T) where T at essentials.jl:205
  Dirac{T}(::Any) where T at /home/hakank/.julia/packages/Distributions/Xrm9e/src/univariate/discrete/dirac.jl:22

Here’s the model (it’s a port of a BLOG model for counting the number of logins of honest/dishonest people):

using Turing


@model function person_login()

    # Number of people (truncate at 1 to exclude the case of 0 people)
    numPeople ~ truncated(Poisson(5), 1, Inf)

    # Is each person honest?
    honest ~ filldist(Bernoulli(0.9), numPeople)

    # An honest person has exactly one login; a dishonest person's
    # number of logins is geometrically distributed
    login = tzeros(numPeople)
    for p in 1:numPeople
        if honest[p]
            login[p] ~ Dirac(1.0)
        else
            login[p] ~ Geometric(0.8)
        end
    end
end

model = person_login()
chns = sample(model, SMC(), 10_000)

display(chns)

Why does this sometimes work and sometimes not? And: is there a better way to state this part:

    for p in 1:numPeople
        if honest[p]
            login[p] ~ Dirac(1.0)
        else
            login[p] ~ Geometric(0.8)
        end
    end

It doesn’t seem to matter what the probability of honest is: I get the same error with a probability of 0.5 as with 0.9.

Versions:

  • Julia: v1.6.2
  • Turing v0.16.6
  • Distributions v0.24.18

I should also mention that other samplers consistently throw different errors:

  • Running PG(5) throws DimensionMismatch("tried to assign 5 elements to 6 destinations")
  • MH(): LoadError: DimensionMismatch("Inconsistent array dimensions.")

So something is weird with this model…

Since you’re trying to do inference, where are you planning to introduce data in the model? e.g. will login be the data? If so, it should be passed as an argument to the model.
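For instance, a hypothetical variant where `login` is passed in as observed data (the data values here are made up) might look like this:

```julia
using Turing

# Hypothetical variant: `login` is observed data, so the number of
# people is fixed by the length of the data vector.
@model function person_login_obs(login)
    numPeople = length(login)

    # Is each person honest?
    honest ~ filldist(Bernoulli(0.9), numPeople)

    # Observe each login count under the branch chosen by honest[p]
    for p in 1:numPeople
        if honest[p]
            login[p] ~ Dirac(1.0)
        else
            login[p] ~ Geometric(0.8)
        end
    end
end

# Condition on some made-up login counts:
model = person_login_obs([1.0, 1.0, 4.0])
```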


@sethaxen Thanks, you have a very good point. There’s no data for the model, it’s just an “exploratory” model. Using Prior() seems to work without any problem.

But isn’t it strange that the different samplers throw these errors? Or is that to be expected if there is no observed data?

WebPPL (and BLOG) don’t have any problems with these kinds of “exploratory” models without observed data, regardless of sampler, but I might have to be a little more careful about this in Turing.jl.

Just a heads up here: you can do away completely with the Bernoulli and the if by just using a Mixture of Dirac and Geometric.
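A rough sketch of what I mean (the 0.9/0.1 weights mirror the Bernoulli(0.9); note the component vector is left untyped so Julia picks a common abstract element type):

```julia
using Distributions

# Mixture over the two login behaviours: honest -> Dirac(1.0),
# dishonest -> Geometric(0.8), with prior weights 0.9 / 0.1.
login_dist = MixtureModel([Dirac(1.0), Geometric(0.8)], [0.9, 0.1])

rand(login_dist)      # draws from Dirac(1.0) w.p. 0.9, else from Geometric(0.8)
pdf(login_dist, 1.0)  # 0.9 * pdf(Dirac(1.0), 1) + 0.1 * pdf(Geometric(0.8), 1)
```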

But even so, I don’t think this should cause issues, i.e. it’s a bug, so feel free to raise an issue over on GitHub! I’m pretty certain SMC should support dynamic models.

I’ll also bring this to the attention of the people on our team who work on the SMC parts :+1:

EDIT: So technically we support this, but the default impl of most samplers, e.g. SMC, assumes that the support of each variable is fixed (the parameter space is allowed to change, though). In the future you’ll be able to tell the samplers “hey sampler, this model actually has changing support, so make sure you use an internal representation that supports it”. As of right now there’s no way for the sampler to know, and for perf reasons they usually make some assumptions :confused:

A hotfix is to introduce a “wrapper” distribution

struct DynamicContinuousUnivariateDist <: ContinuousUnivariateDistribution
    dist
end

and implement logpdf and rand for it (see Distributions.jl). This will avoid the conversion issue.


@torfjelde Thanks for the suggestions.

I’ll submit an issue about this.

And I’ll try to understand how to implement a “wrapper” distribution. But shouldn’t it be struct DynamicDiscreteUnivariateDist <: DiscreteUnivariateDistribution, since both Dirac and Geometric are discrete distributions?
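A minimal sketch of such a wrapper, assuming it should indeed be discrete (method names follow Distributions.jl, though I haven’t verified this is everything a sampler needs):

```julia
using Distributions
using Random

# Hypothetical wrapper: one concrete type that hides whether the
# underlying distribution is Dirac or Geometric.
struct DynamicDiscreteUnivariateDist <: DiscreteUnivariateDistribution
    dist::DiscreteUnivariateDistribution
end

# Forward the two methods the hotfix mentions to the wrapped distribution.
Distributions.logpdf(d::DynamicDiscreteUnivariateDist, x::Real) = logpdf(d.dist, x)
Base.rand(rng::Random.AbstractRNG, d::DynamicDiscreteUnivariateDist) = rand(rng, d.dist)

# Both branches now produce the same concrete type:
d1 = DynamicDiscreteUnivariateDist(Dirac(1.0))
d2 = DynamicDiscreteUnivariateDist(Geometric(0.8))
```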

Regarding MixtureModel (which I assume you refer to with Mixture), I tried the following:

@model function person_login_mixed_dist()

    # Number of people (truncate at 1 to exclude the case of 0 people)
    numPeople ~ truncated(Poisson(5), 1, Inf)

    # Is each person honest?
    honest ~ filldist(Bernoulli(0.9), numPeople)
    sumHonest ~ Dirac(sum(honest))

    # An honest person has exactly one login
    login ~ filldist(MixtureModel(Normal[Dirac(1.0), Geometric(0.8)]), numPeople)  # <---

end
model = person_login_mixed_dist()
chns = sample(model, Prior(), 10_000)

but get a similar error regarding convert and Dirac{Float64}:

ERROR: LoadError: MethodError: Cannot `convert` an object of type 
  Dirac{Float64} to an object of type 
  Normal
Closest candidates are:
  convert(::Type{Normal}, ::NormalCanon) at /home/hakank/.julia/packages/Distributions/Xrm9e/src/univariate/continuous/normalcanon.jl:30
  convert(::Type{T}, ::T) where T at essentials.jl:205
...

Exactly the same error is thrown by a loop version:

    # ...
    login = tzeros(numPeople)
    for p in 1:numPeople
        login[p] ~ MixtureModel(Normal[Dirac(1.0),Geometric(0.8)])
    end
    # ...

Perhaps I’ve misused MixtureModel?

Also, even if this worked, it wouldn’t really solve the complete modeling problem of if honest[p] Dirac(1) else Geometric(0.8) end, since Dirac(1) should be used only when a person is honest.
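One hypothetical way around that is to let honest[p] choose degenerate mixture weights, so both branches share a single distribution type (whether this also sidesteps the sampler issue is something I haven’t verified):

```julia
using Turing

@model function person_login_mix()
    numPeople ~ truncated(Poisson(5), 1, Inf)
    honest ~ filldist(Bernoulli(0.9), numPeople)

    login = tzeros(numPeople)
    for p in 1:numPeople
        # Degenerate weights: an honest person gets the Dirac component
        # only; a dishonest person gets the Geometric component only.
        w = honest[p] ? 1.0 : 0.0
        login[p] ~ MixtureModel([Dirac(1.0), Geometric(0.8)], [w, 1 - w])
    end
end
```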

Update: I’ve now reported it to Turing.jl: https://github.com/TuringLang/Turing.jl/issues/1669