Approximating the inverse of an expensive CDF function

I have a CDF for a real-valued random variable that is relatively expensive to compute, and I need an approximate inverse in order to implement sampling. There are techniques tailored to this use case; one, implemented in the UNU.RAN library, uses Hermite polynomial approximation (see Hörmann 2002).

My question is whether this, or something comparable, has been implemented in Julia. Or perhaps someone has a different suggestion for approximating the inverse?
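To make the question concrete, my understanding of the basic idea is roughly the (untested) sketch below: evaluate the expensive CDF and PDF at a modest number of points, then piece together cubic Hermite interpolants of the quantile function between them, using the fact that the derivative of F⁻¹ is 1/f(F⁻¹(u)). All names here are placeholders, not an existing package:

```julia
# Rough sketch of the idea (not the UNU.RAN algorithm itself): build a cubic
# Hermite approximation of the quantile function Q = F⁻¹ from a small number
# of expensive CDF/PDF evaluations, then sample via Q(rand()).
function make_approx_quantile(cdf, pdf, xs::AbstractVector)
    us = cdf.(xs)          # nodes on the probability axis, u_i = F(x_i)
    ms = 1 ./ pdf.(xs)     # slopes dQ/du = 1 / f(Q(u)) at the nodes
    function Q(u)
        i = clamp(searchsortedlast(us, u), 1, length(us) - 1)
        h = us[i+1] - us[i]
        t = (u - us[i]) / h
        # standard cubic Hermite basis functions on [u_i, u_{i+1}]
        h00 = (1 + 2t) * (1 - t)^2
        h10 = t * (1 - t)^2
        h01 = t^2 * (3 - 2t)
        h11 = t^2 * (t - 1)
        return h00*xs[i] + h10*h*ms[i] + h01*xs[i+1] + h11*h*ms[i+1]
    end
    return Q
end

# usage: Q = make_approx_quantile(mycdf, mypdf, range(-10, 10, length=50))
#        x = Q(rand())
```

The fiddly parts (choosing the evaluation points, controlling the error, handling the tails) are exactly what I'd hope a package takes care of.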

There are lots of packages for polynomial interpolation in Julia that can be applied to approximate the inverses of expensive functions. See e.g. Approximate inverse of a definite integral - #3 by stevengj
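For instance, one way to set that up (just an untested sketch; I'm assuming FastChebInterp.jl for the polynomial fit and Roots.jl for the one-time root solves, but other similar packages would do) is to solve F(x) = u once at each Chebyshev node and then interpolate those solutions:

```julia
using FastChebInterp, Roots

# Sketch: approximate F⁻¹ by a Chebyshev polynomial.  The expensive CDF is
# only called while solving F(x) = u at each interpolation node; the
# resulting polynomial is cheap to evaluate when sampling.
function cheb_inverse(cdf, xlo, xhi; order=30)
    ulo, uhi = cdf(xlo), cdf(xhi)        # probability range covered
    us = chebpoints(order, ulo, uhi)     # Chebyshev nodes in u
    xs = [find_zero(x -> cdf(x) - u, (xlo, xhi)) for u in us]  # expensive step
    return chebinterp(xs, ulo, uhi)      # callable approximation of F⁻¹
end

# usage: Q = cheb_inverse(mycdf, -10.0, 10.0); x = Q(rand())
```

You'd want [xlo, xhi] wide enough that cdf(xlo) ≈ 0 and cdf(xhi) ≈ 1, or else clamp the uniform draws to the covered probability range.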

I’m not sure whether anything is prepackaged specifically for approximating inverse CDFs from data, but presumably you want some kind of fit or spline through the numerical quantiles (via the quantile function in the Statistics standard library, for example) evaluated on a grid of probabilities. There are lots of packages in Julia for various forms of spline interpolation too, including monotonic Hermite splines, so it should be straightforward to apply one of them to the quantile output, no?
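Something along these lines, for example (untested, and assuming Interpolations.jl's monotonic interpolation constructors; the samples here are just a stand-in for whatever data you have):

```julia
using Statistics, Interpolations

samples = randn(10_000)                        # stand-in for your data
ps = collect(range(0.001, 0.999, length=200))  # probability grid
qs = quantile(samples, ps)                     # empirical quantiles on the grid

# monotone cubic Hermite (Fritsch-Carlson) fit of the quantile function
Q = interpolate(ps, qs, FritschCarlsonMonotonicInterpolation())

# sampling: draw u within the covered range and evaluate the interpolant
u = ps[1] + (ps[end] - ps[1]) * rand()
x = Q(u)
```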

PS. I split this post off into a new thread, rather than resurrecting a 3-year-old thread on a loosely related topic.


If the goal is just to sample, then inverting the CDF is not necessarily the best (or most common) method.

Is the CDF itself computed with something like a Monte Carlo method, and is that why it’s expensive? Or is it a physical simulation, or something else entirely?

If it’s just that you know the PDF but computing its integral requires numerical methods, then it’s sometimes best to use rejection-sampling approaches. Take a look at the ziggurat algorithm, or variations on it.
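For example, the bare-bones accept/reject version (not the ziggurat itself, just the basic idea, assuming a density you can evaluate that is bounded on a finite interval) is only a few lines:

```julia
# Sketch of plain rejection sampling: propose uniformly on [a, b], accept with
# probability pdf(x) / pmax, where pmax bounds the (possibly unnormalized) pdf.
function rejection_sample(pdf, a, b, pmax)
    while true
        x = a + (b - a) * rand()
        rand() * pmax <= pdf(x) && return x
    end
end

# usage: x = rejection_sample(x -> exp(-x^2 / 2), -8.0, 8.0, 1.0)
```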

Is there a reason you absolutely have to use inverse CDF?