This package provides a research-grade collection of additional probability distributions not included in the base Distributions.jl, while remaining fully compatible with its API.
Main Features
Univariate distributions such as Burr, Lomax, Dagum, and Kotz families.
Multivariate extensions including Elliptical and Heavy-Tailed models.
High-precision QMC-based algorithms for Gaussian and Student t CDFs.
Full interoperability with pdf, cdf, rand, fit, and statistical functions.
Compatible with the JuliaStats ecosystem (StatsBase, Distributions, etc.).
Quick Example
using AdditionalDistributions, Distributions
# Univariate Burr distribution
d = Burr(α = 2.0, c = 3.0)
pdf(d, 1.0)
rand(d, 5)
# Multivariate t-distribution
Σ = [1.0 0.6; 0.6 1.0]
t = MvTStudent(ν = 4, μ = [0.0, 0.0], Σ = Σ)
cdf(t, [0.3, 0.7])
Contributions
Feedback, feature suggestions, and pull requests are very welcome!
You can also open issues for new distributions or additional numerical methods.
With my PhD student, we recently used MvNormalCDF.jl, so I have a bunch of related questions.
I don’t understand the method you are using for the cdf of MvNormal. The README says QMC Sobol, but here it says MVSORT + QMC (Richtmyer + shifts).
Is your method in a paper? Is it just a change of variable a la Genz to have only [0,1] integrals and then just a different QMC method?
What are the guarantees provided by the error terms?
I don’t understand how you can have a fair comparison for speed while having different tolerances for final error.
Our tests showed that the Genz method implemented in MvNormalCDF.jl (perhaps not optimally, but already great) was, in our case study, better and more robust than most other methods from the literature, so I am surprised you get such good results.
My test of the #master of your package shows a factor of almost 10 in your favor over MvNormalCDF.jl on your 25-dimensional example (for similar error). It makes me really curious!
Do you think you could interface with QuasiMonteCarlo.jl to leverage other (R)QMC methods?
Hello, thank you so much for trying the package and for your interest. It’s very valuable for my work.
Clarification about the method and the README.
I apologize for the confusion. The README is outdated: I initially tried Sobol, but the current version uses MVSORT + QMC (Richtmyer + Cranley–Patterson shifts), with an antithetic option. In other words, the same Genz-style integrand (sequential conditioning with pivoting) combined with a different RQMC engine.
Relationship with Genz–Bretz.
The core is Genz's: normalization to correlations, ordering/pivoting (MVSORT), construction of the integrand from conditional masses, and propagation via Φ⁻¹. The difference lies in the integration rule (a Richtmyer lattice with random shifts) and in implementation details (triangular packing, pre-allocation, etc.).
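For readers unfamiliar with the rule, here is a minimal sketch of a Richtmyer (Kronecker) lattice with a Cranley–Patterson random shift. The function name richtmyer_points and the √prime generating vector are illustrative assumptions, not the package's actual code:

```julia
# Sketch (hypothetical, not the package's internals): m points of a
# Richtmyer lattice in [0,1)^d, x_k = frac(k·α + Δ), where α holds square
# roots of primes (irrational, hence equidistributed) and Δ is a uniform
# random Cranley–Patterson shift.
function richtmyer_points(m::Int, d::Int; shift = rand(d))
    primes = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47)
    d <= length(primes) || error("sketch only supports d ≤ $(length(primes))")
    α = sqrt.(collect(primes[1:d]))
    pts = Matrix{Float64}(undef, d, m)
    for k in 1:m, j in 1:d
        pts[j, k] = mod(k * α[j] + shift[j], 1.0)   # frac(k·α_j + Δ_j)
    end
    return pts
end
```

Redrawing `shift` and re-averaging gives the independent replicates used for the error estimate.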
Error Estimation. The reported “error” is a randomized-shift estimate, not a deterministic bound; it is the MC/RQMC analogue of the standard error commonly reported with Genz–Bretz. The tolerance criterion is applied uniformly to the between-shift variance. I am still refining this aspect to better align accuracy and runtime in all cases.
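As a sketch of how such a shift-based estimate works in general (the name shifted_estimate is hypothetical, not the package's API): each independent shift yields one unbiased estimate of the integral, and the reported error is the standard error of their mean:

```julia
# Sketch: RQMC value/error from independent Cranley–Patterson shifts.
# `integrate_one_shift` is assumed to run the whole lattice rule once
# with a fresh random shift and return one unbiased estimate.
function shifted_estimate(integrate_one_shift; n_shift::Int = 12)
    I = [integrate_one_shift() for _ in 1:n_shift]
    μ = sum(I) / n_shift
    s² = sum((I .- μ) .^ 2) / (n_shift - 1)   # sample variance across shifts
    return μ, sqrt(s² / n_shift)              # (value, standard error)
end
```

This is a statistical estimate: it can undercover for very skewed integrands, which is one reason the tolerance criterion deserves care.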
Speed/Accuracy Comparisons. I strive to compare with the same effective budget (m) and always report (value, error) for both methods. In my tests, Richtmyer’s rule with 12 shifts yields typical errors of 10⁻⁵–10⁻⁶ with less overhead than the adaptive routine, hence the observed gains. If you have a specific case study (Σ, a, b, ν, m) where you see significant differences, I’d love to reproduce it and compare outputs/times.
Sobol / QuasiMonteCarlo.jl
I also tested Sobol.jl and QuasiMonteCarlo.jl. In my tests, they didn’t offer a clear advantage over Richtmyer (and added some overhead). Even so, my idea is to expose a pluggable backend for RQMC (Sobol, Korobov, generated lattices, etc.). If we can demonstrate better performance with another generator, I’d be happy to integrate it.
Next steps.
I will update the README to accurately reflect the current backend, precisely document the error estimate, and open the extension point for other RQMC generators. Any optimization suggestions (speed, memory, modularity) or pull requests are very welcome.
I look forward to hearing about your specific case and your recommendations. Thanks again for your interest and constructive criticism!
Thanks! I would be actually very curious to test different RQMC strategies, as I worked with QMC before. However, I am not sure exactly where to do it in your code. Maybe we could chat at some point if you are also interested.
I always wondered why no one tried toying with that, as the choice seemed rather arbitrary (or I missed some deep reason why this particular QMC rule should work better in this case).
My guess: Genz used an easy-to-code QMC sequence (at the time of his paper there were not as many available) and people then just copied his code.
Nice that you were able to improve the implementation of the pivot method. Could it benefit MvNormalCDF.jl too?
First, I apologize for the delay; I’ve been quite busy this week.
Thank you for your interest. It would be very interesting to explore different RQMC strategies with you.
I think Genz’s choice wasn’t so arbitrary, because the adaptive Cholesky conditioning leaves a smooth and stable integrand.
So, in these cases, lattice-rule generators perform well (plus, Genz hard-codes these values in his implementation). Still, it would be quite interesting to find a way to use other generators for comparison and give the user the freedom to choose.
My package keeps the same logic, but the engineering changes, attempting to take advantage of what Julia offers. However, I believe there is still much room for improvement, especially in the error estimation (by the way, when I used QuasiMonteCarlo.jl the error was very small, but it significantly reduced speed).
To plug other RQMC packages into the code, the piece to modify is the point generator inside _qmc_integrate (where I currently do…). Everything else stays the same.
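A minimal sketch of what such a pluggable point generator could look like, using hypothetical names (PointSet, generate, qmc_mean) rather than the package's actual internals; the idea is that the integrator only consumes a d×m point matrix, so swapping generators means adding one method:

```julia
# Sketch of a pluggable RQMC backend via Julia dispatch (names hypothetical).
abstract type PointSet end
struct MCPoints <: PointSet end         # plain Monte Carlo fallback
struct LatticePoints <: PointSet        # shifted rank-1 / Kronecker lattice
    z::Vector{Float64}                  # generating vector, e.g. sqrt.(primes)
end

generate(::MCPoints, m::Int, d::Int) = rand(d, m)
function generate(ps::LatticePoints, m::Int, d::Int)
    Δ = rand(d)                         # fresh Cranley–Patterson shift per call
    return [mod(k * ps.z[j] + Δ[j], 1.0) for j in 1:d, k in 1:m]
end

# Backend-agnostic integrator: average f over whichever point set is passed.
function qmc_mean(f, ps::PointSet, m::Int, d::Int)
    P = generate(ps, m, d)
    return sum(f(view(P, :, k)) for k in 1:m) / m
end
```

A Sobol backend from QuasiMonteCarlo.jl would then just be one more `generate` method, leaving the integrand untouched.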
As for benefiting MvNormalCDF.jl, I don’t know how open they are to this type of experimentation, haha. It’s a standard package that’s difficult to modify.
Actually, what you observe is what I would expect from a “better” generator on a smooth integrand (like Sobol + scrambling).
That is why comparing at fixed m is tricky: some RQMC rules have error convergence like O(m^{-3/2}), while others are “just” O(m^{-1}) (the plain-MC baseline is O(m^{-1/2})).
Meaning you can reach the same accuracy with a smaller m. So maybe the “best” RQMC generators are computationally costlier (scrambling is expensive), but if they need 10× fewer points it is worth it. (Not to say QuasiMonteCarlo.jl cannot be improved in terms of speed.)
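One way to make such comparisons concrete: if error ≈ C·m^r, then r is the slope of log(error) against log(m), so the empirical rate can be fit from (m, error) pairs before deciding which generator wins at a given accuracy. A least-squares sketch (rate is a hypothetical helper, not from either package):

```julia
# Sketch: empirical convergence rate r from (m, error) pairs,
# as the least-squares slope of log(error) vs log(m).
function rate(ms, errs)
    x, y = log.(ms), log.(errs)
    x̄, ȳ = sum(x) / length(x), sum(y) / length(y)
    return sum((x .- x̄) .* (y .- ȳ)) / sum((x .- x̄) .^ 2)
end
```

An MC-like method should fit r ≈ -1/2 and a good lattice closer to -1; comparing at "equal accuracy" then means solving C₁m₁^{r₁} = C₂m₂^{r₂} for the budgets, not setting m₁ = m₂.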
I’ll try to take a look and maybe contact you when I have time.