Sampling from custom distribution: "No method matching iterate..."

Hello fellow Julians,

I’m trying to use a custom distribution/sampleable for inference in Turing.jl, and I’ve encountered a problem I can’t solve. Following the instructions in the Turing docs, I made the following minimal working example – a uniform distribution on [0, L]:

    struct MyUniformDist <: ContinuousUnivariateDistribution
        L::Real
    end
    Distributions.logpdf(d::MyUniformDist, x::Real) = 0 ≤ x ≤ d.L ? log(1/d.L) : -Inf
    Distributions.rand(rng::AbstractRNG, d::MyUniformDist) = rand(rng) * d.L
    Distributions.minimum(d::MyUniformDist) = 0.
    Distributions.maximum(d::MyUniformDist) = d.L

I try to sample from it,

    foo = MyUniformDist(3)
    rand(foo)

and get the following error:

    ERROR: MethodError: no method matching iterate(::MyUniformDist)

    Closest candidates are:
      iterate(::LLVM.StructTypeElementSet)
       @ LLVM ~/.julia/packages/LLVM/bzSzE/src/core/type.jl:246
      iterate(::LLVM.StructTypeElementSet, ::Any)
       @ LLVM ~/.julia/packages/LLVM/bzSzE/src/core/type.jl:246
      iterate(::CSV.Chunks)
       @ CSV ~/.julia/packages/CSV/XLcqT/src/chunks.jl:91
      ...

    Stacktrace:
     [1] copyto!(dest::Vector{Float64}, src::MyUniformDist)
       @ Base ./abstractarray.jl:943
     [2] _collect(cont::UnitRange{Int64}, itr::MyUniformDist, ::Base.HasEltype, isz::Base.HasLength)
       @ Base ./array.jl:765
     [3] collect(itr::MyUniformDist)
       @ Base ./array.jl:759
     [4] quantile(itr::MyUniformDist, p::Float64; sorted::Bool, alpha::Float64, beta::Float64)
       @ Statistics /Applications/Julia-1.10.app/Contents/Resources/julia/share/julia/stdlib/v1.10/Statistics/src/Statistics.jl:1086
     [5] rand(rng::Random.TaskLocalRNG, d::MyUniformDist)
       @ Distributions ~/.julia/packages/Distributions/uuqsE/src/univariates.jl:157
     [6] rand(::MyUniformDist)
       @ Distributions ~/.julia/packages/Distributions/uuqsE/src/genericrand.jl:22
     [7] top-level scope
       @ ~/path/to/my/file.jl:268

Can anyone give me some insight into what’s going on?

Thanks so much,

Alex M

Edit: fixed some remnants of my actual use-case in the MWE, but the problem remains the same.

P.S. Here’s the output of ]st (I have a lot of packages that I don’t use anymore):

  [cbdf2221] AlgebraOfGraphics v0.8.13
  [c9ce4bd3] ArchGDAL v0.10.4
  [6e4b80f9] BenchmarkTools v1.5.0
  [024491cd] BetaML v0.12.1
  [e2ed5e7c] Bijections v0.1.9
  [a134a8b2] BlackBoxOptim v0.6.3
  [336ed68f] CSV v0.10.15
  [13f3f980] CairoMakie v0.12.15
  [e2e10f9a] CatBoost v0.3.5
  [aaaa29a8] Clustering v0.15.7
  [35d6a980] ColorSchemes v3.27.0
⌅ [5ae59095] Colors v0.12.11
  [861a8166] Combinatorics v1.0.2
  [7d11a335] ConcaveHull v1.1.0
⌅ [ae264745] Copulas v0.1.13
⌃ [a93c6f00] DataFrames v1.6.1
  [864edb3b] DataStructures v0.18.20
  [7806a523] DecisionTree v0.12.4
  [8bb1440f] DelimitedFiles v1.9.1
  [39dd38d3] Dierckx v0.5.3
⌃ [c894b116] DiffEqJump v8.6.2
⌃ [0c46a032] DifferentialEquations v7.11.0
  [b4f34e82] Distances v0.10.12
  [31c24e10] Distributions v0.25.112
  [ab853011] EvoLinear v0.4.3
  [f6006082] EvoTrees v0.16.7
⌃ [587475ba] Flux v0.14.21
  [f6369f11] ForwardDiff v0.10.36
  [38e38edf] GLM v1.9.0
  [2fb1d81b] GeoArrays v0.9.0
  [61d90e0f] GeoJSON v0.8.1
  [523d8e89] Gillespie v0.1.0 `https://github.com/sdwfrost/Gillespie.jl#master`
⌃ [af5da776] GlobalSensitivity v2.4.0
  [7073ff75] IJulia v1.25.0
  [a98d9a8b] Interpolations v0.15.1
  [3587e190] InverseFunctions v0.1.17
  [c3a54625] JET v0.9.12
⌅ [033835bb] JLD2 v0.4.53
  [9da8a3cd] JLSO v2.7.0
⌃ [ccbc3e58] JumpProcesses v9.10.1
  [5ab0869b] KernelDensity v0.6.9
  [23b0397c] KissSmoothing v1.0.8
  [b1bec4e5] LIBSVM v0.8.1
  [984bce1d] LambertW v1.0.0
  [a5e1c1ea] LatinHypercubeSampling v1.9.0
  [b4f0291d] LazySets v2.14.2
  [d3d80556] LineSearches v7.3.0
  [bdcacae8] LoopVectorization v0.12.171
  [f0e99cf1] MLBase v0.9.2
  [add582a8] MLJ v0.20.7
  [a7f614a8] MLJBase v1.7.0
  [c6f25543] MLJDecisionTreeInterface v0.4.2
  [094fc8d1] MLJFlux v0.6.0
  [caf8df21] MLJGLMInterface v0.3.7
  [61c7150f] MLJLIBSVMInterface v0.2.1
  [6ee0df7b] MLJLinearModels v0.10.0
  [1b6a4a23] MLJMultivariateStatsInterface v0.5.3
  [5ae90465] MLJScikitLearnInterface v0.7.0
  [54119dfa] MLJXGBoostInterface v0.3.11
  [ff71e718] MixedModels v4.26.1
  [af6c499f] MutableNamedTuples v0.1.3
  [2774e3e8] NLsolve v4.5.1
  [77ba4419] NaNMath v1.0.2
  [636a865e] NearestNeighborModels v0.2.3
  [b8a86587] NearestNeighbors v0.4.20
  [429524aa] Optim v1.9.4
  [90014a1f] PDMats v0.11.31
  [2c7acb1b] PDMatsExtras v2.8.0
  [9b87118b] PackageCompiler v2.1.21
  [c3e4b0f8] Pluto v0.20.1
  [7f904dfe] PlutoUI v0.7.60
  [e409e4f3] PoissonRandom v0.4.4
  [647866c9] PolygonOps v0.1.2
  [67491407] Polyhedra v0.7.8
  [27ebfcd6] Primes v0.5.6
  [92933f4c] ProgressMeter v1.10.2
  [1fd47b50] QuadGK v2.11.1
  [8a4e6c94] QuasiMonteCarlo v0.3.3
  [6f49c342] RCall v0.14.6
  [e6cf234a] RandomNumbers v1.6.0
  [a3a2b9e3] Rasters v0.12.0
  [f2b01f46] Roots v2.2.1
  [cdeec39e] SIRUS v2.0.1
  [8e980c4a] Shapefile v0.13.1
  [05bca326] SimpleDiffEq v1.11.1
  [276daf66] SpecialFunctions v2.4.0
  [860ef19b] StableRNGs v1.0.2
  [90137ffa] StaticArrays v1.9.8
  [2913bbd2] StatsBase v0.34.3
  [f3b207a7] StatsPlots v0.15.7
  [fd094767] Suppressor v0.2.8
  [bd369af6] Tables v1.12.0
⌃ [fce5fe82] Turing v0.34.1
  [009559a3] XGBoost v2.5.1
  [37e2e46d] LinearAlgebra
  [9a3f8284] Random
  [2f01184e] SparseArrays v1.10.0

I fixed it. The problem turned out to be a clash between rand-related definitions coming from the Random and RandomNumbers packages.

I guess that’s a real reason to clean up my package environment after all.
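
In case it helps anyone else who lands here: my best guess at the mechanism (I haven’t fully verified this) is that RandomNumbers defines and exports its own AbstractRNG type, so if that is the AbstractRNG in scope when the method is defined, the method never matches the default Random.TaskLocalRNG and Distributions falls back to its quantile-based rand, which produces the stacktrace above. A minimal sketch of what I mean, assuming the clash is over the AbstractRNG name:

    using Distributions
    using RandomNumbers        # note: Random itself is NOT loaded here

    struct MyUniformDist <: ContinuousUnivariateDistribution
        L::Real
    end
    Distributions.logpdf(d::MyUniformDist, x::Real) = 0 ≤ x ≤ d.L ? log(1/d.L) : -Inf
    Distributions.minimum(d::MyUniformDist) = 0.
    Distributions.maximum(d::MyUniformDist) = d.L

    # Without `using Random`, AbstractRNG here is RandomNumbers.AbstractRNG, so this
    # method never matches Random.TaskLocalRNG and rand() hits the quantile fallback.
    Distributions.rand(rng::AbstractRNG, d::MyUniformDist) = rand(rng) * d.L

    parentmodule(AbstractRNG)  # sanity check: RandomNumbers, not Random
    rand(MyUniformDist(3))     # => MethodError: no method matching iterate(::MyUniformDist)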

The main problem is that you need to import the methods you are extending, including rand. Try the following:

    using Distributions
    import Distributions: logpdf, rand, maximum, minimum
    using Random

    struct MyUniformDist <: ContinuousUnivariateDistribution
        L::Real
    end
    logpdf(d::MyUniformDist, x::Real) = 0 ≤ x ≤ d.L ? log(1/d.L) : -Inf
    rand(rng::AbstractRNG, d::MyUniformDist) = rand(rng) * d.L
    minimum(d::MyUniformDist) = 0.
    maximum(d::MyUniformDist) = d.L

    foo = MyUniformDist(3)
    rand(foo)
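
Since the end goal was inference in Turing.jl, here is a rough, untested sketch of how the fixed distribution could then be used inside a model. Prior() is chosen only because it needs nothing beyond rand and logpdf; gradient-based samplers such as NUTS may additionally want a Bijectors.bijector for the bounded support.

    using Turing

    # Toy model reusing MyUniformDist from above as a prior on a Normal mean:
    @model function demo(y)
        m ~ MyUniformDist(3)
        y ~ Normal(m, 1)
    end

    chain = sample(demo(1.2), Prior(), 500)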

And you are right about using Random: it must be loaded so that AbstractRNG is available. You do not need to import rand if you qualify the definition as Distributions.rand.
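
For completeness, a rough sketch of that fully qualified variant (essentially the original MWE plus the two using lines; untested here):

    using Distributions
    using Random: AbstractRNG   # make sure AbstractRNG is the stdlib type

    struct MyUniformDist <: ContinuousUnivariateDistribution
        L::Real
    end

    # Extending via fully qualified names, so no `import Distributions: ...` is needed:
    Distributions.logpdf(d::MyUniformDist, x::Real) = 0 ≤ x ≤ d.L ? log(1/d.L) : -Inf
    Distributions.rand(rng::AbstractRNG, d::MyUniformDist) = rand(rng) * d.L
    Distributions.minimum(d::MyUniformDist) = 0.
    Distributions.maximum(d::MyUniformDist) = d.L

    rand(MyUniformDist(3))      # now dispatches to the method above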

Thanks so much @Christopher_Fisher for clarifying that & making it useful to future readers! I appreciate it.
