Pre-allocating an interpolant with Interpolations.jl

Hi,

I have a loop where I need to create a new interpolant on each iteration. Is there a way to pre-allocate the interpolant in Interpolations.jl, so that I don't allocate new memory every time through the loop? The loop uses the same grid each time; only the values to interpolate change.

Here is an MWE based on the example in the Interpolations.jl documentation:

using Interpolations

xs = 1:0.2:5
A = log.(xs)

# Create linear interpolation object without extrapolation
for _ in 1:5
    interp_linear = linear_interpolation(xs, A)
end

I tried to use the type of interp_linear to pre-allocate, but it doesn't seem to work:

interp_linear = Interpolations.Extrapolation{Float64, 1, ScaledInterpolation{Float64, 1, Interpolations.BSplineInterpolation{Float64, 1, Vector{Float64}, BSpline{Linear{Throw{OnGrid}}}, Tuple{Base.OneTo{Int64}}}, BSpline{Linear{Throw{OnGrid}}}, Tuple{StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64}}}, BSpline{Linear{Throw{OnGrid}}}, Throw{Nothing}}

for _ in 1:5
    interp_linear .= linear_interpolation(xs, A)
end

This errors with ERROR: CanonicalIndexError: setindex! not defined for Interpolations.Extrapolation{Float64, 1, ScaledInterpolation{Float64, 1, Interpolations.BSplineInterpolation{Float64, 1, Vector{Float64}, BSpline{Linear{Throw{OnGrid}}}, Tuple{Base.OneTo{Int64}}}, BSpline{Linear{Throw{OnGrid}}}, Tuple{StepRangeLen{Float64, Base.TwicePrecision{Float64}, Base.TwicePrecision{Float64}, Int64}}}, BSpline{Linear{Throw{OnGrid}}}, Throw{Nothing}}


Sorry, this is not quite how Julia works in general. Specifically, storing a type in a variable is not a way to preallocate memory. Also, broadcast assignment into that variable will not perform an in-place operation.

What might help are functions that end with an exclamation mark, such as Interpolations.interpolate!; however, that does not do what you would like.
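To illustrate: as far as I understand, interpolate! reuses the input array as coefficient storage instead of copying it, but each call still constructs and returns a new interpolation object, so it does not remove the per-iteration allocation. A rough, untested sketch:

using Interpolations

xs = 1:0.2:5
A = log.(xs)

# interpolate! uses A itself as coefficient storage (no copy of the data),
# but it still builds and returns a fresh interpolation object:
itp = Interpolations.interpolate!(A, BSpline(Linear()))
sitp = scale(itp, xs)  # attach the x-grid (xs must be a range)
sitp(3.0)              # ≈ log(3.0)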

There are many interpolation packages. BSplineKit.jl's interpolate! looks like it may do what you would like.
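If I am reading the BSplineKit.jl docs correctly, the pattern would be roughly the following (untested; the cubic order is just an example): build the interpolant once, then update it in place with new values on the same grid.

using BSplineKit

xs = collect(1:0.2:5)
S = interpolate(xs, log.(xs), BSplineOrder(4))  # build the interpolant once (cubic B-splines)

# Reuse S: update it in place with new values on the same grid
for k in 1:5
    interpolate!(S, log.(xs) .+ k)
    S(3.0)
end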

Thanks for the answer! I will have a look at other packages then.

If you are doing linear interpolation, you can precompute the interpolation weights (even with Interpolations.jl although you need to dig into the internals a bit).
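For example, hand-rolled outside Interpolations.jl, the idea looks like this (the bracketing index and weight depend only on the fixed grid and the query point, so they can be computed once; the query point 2.34 is just for illustration):

xs = 1:0.2:5  # fixed grid
xq = 2.34     # query point, the same every iteration

# Precompute the bracketing index and the linear weight once:
i = clamp(searchsortedlast(xs, xq), 1, length(xs) - 1)
w = (xq - xs[i]) / (xs[i+1] - xs[i])

for k in 1:5
    A = log.(xs) .+ k                  # new values on the same grid
    yq = (1 - w) * A[i] + w * A[i+1]   # evaluation is just a weighted sum, no new interpolant
end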

Beyond linear, I think you need the observed y values, so doesn’t the interpolation change every iteration anyway? In this case the best you can do is to try to reuse memory via in-place operations.

Perhaps you are trying to solve a problem that we already solved in GeoStats.jl? Our interpolation routines are efficient: they preallocate a buffer for the results, and fill the buffer during traversal of large geospatial domains.

While it may not be a good idea, you can change the array you're interpolating over if you're using gridded interpolation:

xs = collect(1:0.2:5)
ny = 10
A = [log(x)+y for x in xs, y in 1:ny]

itp = linear_interpolation(xs, A[:,1])  # Let's call this the placeholder interpolation object

# Replace the `coefs` field of the interpolation object with the new data, in place
for iy in 1:ny
    @views itp.itp.coefs[:] = A[:, iy]
    @assert itp(1.0) == iy "$(itp(1.0)) == $(iy)"
end

I do something like this in my code, and it seemed to help performance, as I was re-creating some tens of thousands of interpolation objects. But given @mkitti's reply above, I'm sure he's right: if you're doing this intelligently from the beginning (unlike me :slight_smile:) it won't help.

Regarding weights, I was never able to make it work, though this issue has some details: Reuse weights with extrapolation · Issue #484 · JuliaMath/Interpolations.jl · GitHub