Slow cubature

HCubature is far, far more efficient for this example: it takes only 780 evaluations to compute the integral to 6–7 more digits than you have with Divonne. I’m not actually sure what Cuhre is doing here, because IIRC the Genz–Malik algorithm is only defined for dimensions > 1; in 1d, hcubature uses an order-7 Gauss–Kronrod rule.
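(The definitions of f and expected aren’t shown in this excerpt. Judging from the breakpoints 0.499/0.501 below and the value ≈ 1e-4·√π, they’re presumably something like a narrow Gaussian peak and its closed-form integral; here is a hypothetical reconstruction for anyone who wants to run the snippets, not the original code:)

using SpecialFunctions

# hypothetical integrand: a Gaussian peak of width w = 1e-4 centered at x0 = 0.5
f(x) = exp(-((x - 0.5) / 1e-4)^2)

# closed-form integral of exp(-((x - x0)/w)^2) over [0, 1], via the error function
expected(x0, w) = w * sqrt(π) / 2 * (erf((1 - x0) / w) + erf(x0 / w))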

julia> I, E, count = hcubature_count(x -> f(x[1]), (0,), (1,), atol=1e-9, initdiv=20)
(0.00017724538509052284, 8.334741450662514e-13, 780)

julia> expected(0.5, 0.0001) - I
2.8758462625178005e-17

(I’m using hcubature_count from this post to report the function-evaluation count. I’m passing initdiv=20 because otherwise the low-order rule tends to miss the sharp peak unless you set the tolerance to be extremely small.)
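(If you don’t have that helper handy, a minimal counting wrapper along these lines should behave equivalently; this is a sketch, not the exact code from the linked post:)

using HCubature

# count integrand evaluations by wrapping f in a closure that increments a counter
function hcubature_count(f, a, b; kws...)
    count = 0
    I, E = hcubature(a, b; kws...) do x
        count += 1
        f(x)
    end
    return I, E, count
end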

In general, I wouldn’t expect Monte Carlo algorithms, even with importance sampling like Divonne, to be particularly efficient for low-dimensional integrals at high accuracy.

PS. For a 1d problem where you know the approximate width and location of a sharp peak, it’s even better to use quadgk and pre-subdivide the integration domain into a small region around the peak plus the remainder: only 195 evaluations for slightly better accuracy than hcubature above.

julia> Iq, _ = quadgk_count(f, 0, 0.499, 0.501, 1, atol=1e-9)
(0.0001772453850905604, 5.1230015575457897e-11, 195)

julia> expected(0.5, 0.0001) - Iq
-8.809142651444724e-18

But, of course, if you know this much about your integrand, you can often give various forms of analytical “help” to the quadrature algorithm.
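(For instance, with the hypothetical Gaussian peak assumed above, one such form of help is to change variables to u = (x - x0)/w so that the peak has unit width, and use the known decay of exp(-u^2) to truncate the rescaled interval. A rough sketch:)

using QuadGK

w, x0 = 1e-4, 0.5
# in the rescaled variable the peak is O(1) wide, and exp(-6.5^2) ≈ 4e-19,
# so truncating at |u| ≤ 6.5 perturbs the result far below the tolerance
Iu, _ = quadgk(u -> exp(-u^2), -6.5, 6.5, rtol=1e-12)
I_help = w * Iu   # ≈ expected(x0, w)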