When I create a one-dimensional Normal mixture as

```julia
using Distributions
d = MixtureModel([Normal(-1, 1), Normal(1, 1)], [0.5, 0.5])
```
it turns out that the pdf is not type stable:
```
julia> @code_warntype pdf(d, 0.3)
MethodInstance for Distributions.pdf(::MixtureModel{Univariate, Continuous, Normal{Float64}, Float64}, ::Float64)
  from pdf(d::UnivariateMixture{Continuous}, x::Real) in Distributions at /home/sgaure/.julia/packages/Distributions/Xrm9e/src/mixtures/mixturemodel.jl:419
Arguments
  #self#::Core.Const(Distributions.pdf)
  d::MixtureModel{Univariate, Continuous, Normal{Float64}, Float64}
  x::Float64
Body::Any
1 ─ %1 = Distributions._mixpdf1(d, x)::Any
└──      return %1
```
The pdf is used for integration inside a likelihood optimization, and this type instability slows it down by a non-trivial factor. It is of course easy to work around by hand-coding the pdf for the particular mixtures I need, but it ought to be handled automatically. The pdf also does quite a few allocations, which I think could be avoided.
Is there some simple trick to make type inference work?
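For reference, this is the kind of hand-coded workaround I mean — a minimal sketch for the fixed two-component mixture above (the name `mymixpdf` and the `const` components are my own, not part of Distributions.jl):

```julia
using Distributions

# Hand-rolled pdf for the fixed two-component Normal mixture above.
# Binding the components to typed constants lets inference see a
# concrete Float64 result, with no allocations in the hot loop.
const C1 = Normal(-1.0, 1.0)
const C2 = Normal(1.0, 1.0)

mymixpdf(x::Real) = 0.5 * pdf(C1, x) + 0.5 * pdf(C2, x)
```

With this, `@code_warntype mymixpdf(0.3)` should report an inferred `Body::Float64` instead of `Body::Any`.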
I’m not seeing this type instability. Can you double-check in a clean environment and share the outputs of Pkg.status() and versioninfo() as done below?
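That is, from the REPL:

```julia
using Pkg

# Show the exact package versions in the active environment,
# then the Julia build and platform details.
Pkg.status()
versioninfo()
```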