Quick & dirty online interpolation in ℝⁿ

I am estimating a model that requires mapping a set of parameters x to an implicit solution y = f(x), defined by some F(x, y) = 0. I solve this problem about 40,000 times in a typical run. Calculating an exact f from scratch, with no good initial guess, costs around 30–120 s.

I can obtain y from arbitrary x pretty robustly, but a good initial guess y_0(x) helps me speed this up by 10–20x. For the purpose of this discussion, imagine x \in \mathbb{R}^n, where n is around 20–80.

I thought what would really help is to reuse the values I have already calculated and do some kind of quick & dirty interpolation (the red dot below) or extrapolation (the green dot below). The hollow circles are points I have already solved for.

I have thought of radial basis function interpolation and found an existing implementation, but it does not allow updating when I solve for new points.

What algorithms / packages would you recommend for

  1. an interpolation / extrapolation that can be updated quickly (“online”),

  2. which does not have to be very accurate or even very good, since the guess is refined anyway,

  3. ideally one that does not even need to retain all known values (where the points are dense, I would keep fewer)?

Any interpolation that is based on linear least-squares fitting can be made “online” by using recursive least squares (RLS) to update the parameters.

http://pfister.ee.duke.edu/courses/ece586/ex_proj_2008.pdf
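To make the RLS idea concrete, here is a minimal sketch in Python/NumPy (the class and parameter names are my own, not from any package): the model is linear in its weights, y ≈ Φ(x)ᵀw, and each newly solved point updates w in O(k²) without refitting.

```python
import numpy as np

# Recursive least squares (RLS): keeps a running estimate w of the
# weights of a linear-in-parameters model y ≈ phi(x) @ w, updated one
# sample at a time instead of refitting from scratch.
class RLS:
    def __init__(self, n_features, n_outputs, delta=1e3):
        self.w = np.zeros((n_features, n_outputs))
        # P approximates the inverse feature covariance; a large initial
        # P means "no prior confidence" in w (ridge penalty 1/delta).
        self.P = np.eye(n_features) * delta

    def update(self, phi, y):
        """phi: (n_features,) feature vector, y: (n_outputs,) target."""
        Pphi = self.P @ phi
        k = Pphi / (1.0 + phi @ Pphi)            # gain vector
        self.w += np.outer(k, y - phi @ self.w)  # correct by prediction error
        self.P -= np.outer(k, Pphi)              # rank-one downdate of P

    def predict(self, phi):
        return phi @ self.w
```

Any feature map Φ works here; for your use case Φ would be, e.g., a set of basis functions over x, and y the solved f(x).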

For example, the expansion implemented in BasisFunctionExpansions.jl would be easy to adapt to that use case. I’m not suggesting you use this package; it’s very old and hasn’t been maintained in a long time (do look at the nice GIF in the readme though :slight_smile: ), but the general concept should be straightforward to apply.
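The general concept is just a fixed set of basis functions whose weights are fit by least squares. A sketch in Python/NumPy (the centers, widths, and function names here are illustrative, not the package's API); since the model is linear in the weights, the batch `lstsq` fit below could be replaced by the RLS update as new points are solved:

```python
import numpy as np

# Fixed Gaussian basis-function expansion: features are radial bumps on
# a fixed set of centers, and the model is linear in the weights w.
def gaussian_features(X, centers, width):
    """X: (m, n) inputs, centers: (k, n). Returns an (m, k) feature matrix."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))       # stand-in for already-solved x's
y = np.sin(X[:, 0]) + X[:, 1] ** 2          # stand-in for the solved f(x)

centers = rng.uniform(-1, 1, size=(30, 2))  # fixed centers, e.g. a coarse grid
Phi = gaussian_features(X, centers, width=0.5)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # batch fit; RLS would do this online

x_new = np.array([[0.3, -0.2]])
y0 = gaussian_features(x_new, centers, 0.5) @ w  # cheap initial guess y_0(x)
```

Because the centers are fixed, point 3 of the question comes for free: you never retain the solved points themselves, only the weight vector.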


I don’t know of any package that already provides everything you need, but I think RBFs can in general fulfill your requirements. Regarding the update strategy: for positive definite RBFs this is possible by efficiently updating the Cholesky decomposition of the system matrix, see section 8.4.3 in Approximation Theory and Algorithms for Data Analysis (Springer). I do have this feature on the todo list for my RBF interpolation package KernelInterpolation.jl, but I have never found the time to implement it. However, many of the ingredients are already there, and it should not be too difficult to build the update strategy on top of them. And of course any contribution is welcome :slight_smile:
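The core of that update strategy can be sketched in a few lines (Python/NumPy here for concreteness; the function names are illustrative, not KernelInterpolation.jl's API). When one interpolation point is added, the kernel matrix grows by one row and column, and its Cholesky factor can be extended with a single triangular solve instead of a full O(k³) refactorization:

```python
import numpy as np

def gaussian_kernel(a, b, width=1.0):
    # Positive definite kernel, so the system matrix admits a Cholesky factor.
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * width ** 2))

def cholesky_append(L, X, x_new, kernel):
    """Given L with L @ L.T == K(X, X), return the factor for X + [x_new].

    The new factor is [[L, 0], [l12, l22]]: one solve and a square root,
    O(k^2) work (a dedicated triangular solve would be used in practice).
    """
    k_vec = np.array([kernel(x, x_new) for x in X])
    l12 = np.linalg.solve(L, k_vec)
    l22 = np.sqrt(kernel(x_new, x_new) - l12 @ l12)  # > 0 by positive definiteness
    k = L.shape[0]
    L_new = np.zeros((k + 1, k + 1))
    L_new[:k, :k] = L
    L_new[k, :k] = l12
    L_new[k, k] = l22
    return L_new
```

With the updated factor in hand, the RBF coefficients are recomputed by two triangular solves, so adding a solved point stays cheap even as the data set grows.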
