FWIW, I am not aware of any, but this should not be that complicated, e.g.
using Distributions, StatsBase, IterTools, LinearAlgebra
function make_mixture(h)
    # one Uniform component per bin, weighted by the normalized bin counts
    MixtureModel([Uniform(e...) for e in partition(h.edges[1], 2, 1)],
                 normalize(h.weights, 1))
end
should do it:
julia> h = fit(Histogram, randn(1000))
Histogram{Int64,1,Tuple{StepRangeLen{Float64,Base.TwicePrecision{Float64},Base.TwicePrecision{Float64}}}}
edges:
-4.0:1.0:4.0
weights: [1, 17, 145, 346, 330, 143, 17, 1]
closed: left
isdensity: false
julia> m = make_mixture(h)
MixtureModel{Uniform{Float64}}(K = 8)
components[1] (prior = 0.0010): Uniform{Float64}(a=-4.0, b=-3.0)
components[2] (prior = 0.0170): Uniform{Float64}(a=-3.0, b=-2.0)
components[3] (prior = 0.1450): Uniform{Float64}(a=-2.0, b=-1.0)
components[4] (prior = 0.3460): Uniform{Float64}(a=-1.0, b=0.0)
components[5] (prior = 0.3300): Uniform{Float64}(a=0.0, b=1.0)
components[6] (prior = 0.1430): Uniform{Float64}(a=1.0, b=2.0)
components[7] (prior = 0.0170): Uniform{Float64}(a=2.0, b=3.0)
components[8] (prior = 0.0010): Uniform{Float64}(a=3.0, b=4.0)
julia> mean(m)
-0.010999999999999937
This works for the univariate case, and it should be easy to extend to the multivariate one.
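For instance, a 2D analogue could turn each histogram bin into an axis-aligned box, i.e. a product of two Uniforms. An untested sketch along those lines (make_mixture_2d is a hypothetical name; this assumes Distributions' product_distribution):

using Distributions, StatsBase, IterTools, LinearAlgebra

function make_mixture_2d(h)
    ex = collect(partition(h.edges[1], 2, 1))   # consecutive edge pairs along x
    ey = collect(partition(h.edges[2], 2, 1))   # consecutive edge pairs along y
    # one uniform box per 2D bin; boxes[i, j] matches h.weights[i, j]
    boxes = [product_distribution([Uniform(a...), Uniform(b...)])
             for a in ex, b in ey]
    # vec is column-major for both, so weights stay aligned with boxes
    MixtureModel(vec(boxes), normalize(vec(Float64.(h.weights)), 1))
end

h2 = fit(Histogram, (randn(1000), randn(1000)))
m2 = make_mixture_2d(h2)

The priors should sum to 1 and mean(m2) should return a 2-element vector, but I have not benchmarked this for large numbers of bins.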