I solved my problem. Given the data frame:
julia> df
7×3 DataFrame
 Row │ Λ0       Λests           Eps
     │ Float64  Any             Array…
─────┼──────────────────────────────────────────────────────────────
   1 │ 1.0      0.85:0.025:1.0  [-6895.08, -5862.32, -4767.1, -3…
   2 │ 0.975    0.85:0.025:1.0  [-5891.23, -4795.81, -3639.82, -…
  ⋮  │    ⋮           ⋮                         ⋮
   6 │ 0.875    0.85:0.025:1.0  [-1243.74, 3.61433e-12, 1177.96,…
   7 │ 0.85     0.85:0.025:1.0  [1.13868e-10, 1184.64, 2260.65, …
The following code can be used to create the 2D approximation function:
using FastChebInterp, DataFrames

# Highest safe polynomial order for equidistant samples: N < 2*sqrt(m)
order = floor(Int, 2*sqrt(length(df.Λests[1])))
println("Max order for equidistant sampling: $(order)")

# Flatten the data frame into one 2D sample point per (Λ0, Λest) pair
X = Vector{Float64}[]
Y = Float64[]
for i in 1:length(df.Λ0)
    for j in 1:length(df.Λests[i])
        push!(X, [df.Λ0[i], df.Λests[i][j]])
        push!(Y, df.Eps[i][j])
    end
end
c_ep = chebregression(X, Y, (order, order))
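For reference, the object returned by `chebregression` is callable. Here is a self-contained sketch of the whole flow with a small synthetic data frame standing in for mine (the column names match my frame, but `f` is just a hypothetical smooth test function, not my simulation):

```julia
using FastChebInterp, DataFrames

# Hypothetical smooth stand-in for the simulation results
f(a, b) = a^2 + 3a*b - b^2

Λvals = 1.0:-0.025:0.85          # 7 values, as in my frame
Λests = 0.85:0.025:1.0
df = DataFrame(Λ0 = collect(Λvals),
               Λests = fill(Λests, 7),
               Eps = [f.(a, Λests) for a in Λvals])

order = floor(Int, 2*sqrt(length(df.Λests[1])))   # 5 for 7 samples

X = Vector{Float64}[]
Y = Float64[]
for i in 1:length(df.Λ0), j in 1:length(df.Λests[i])
    push!(X, [df.Λ0[i], df.Λests[i][j]])
    push!(Y, df.Eps[i][j])
end

c_ep = chebregression(X, Y, (order, order))

# Evaluate the fitted 2D approximation anywhere inside the data range
c_ep([0.9, 0.9])
```

Since `f` is only quadratic, the order-5 regression reproduces it essentially exactly at `[0.9, 0.9]`.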
My problem was understanding how to build the X and Y vectors, but now it is clear.
If you do not use Chebyshev points (nodes), you risk weird oscillations, but only if the order of your approximation is too high.
Generally, when using m equidistant points, the least-squares approximation $P_N(x)$ is well-conditioned if $N < 2\sqrt{m}$. (Runge's phenomenon - Wikipedia)
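Both effects can be seen numerically with nothing but Base Julia: fit the classic Runge function $1/(1+25x^2)$ on m equidistant points by least squares. At $N = \lfloor 2\sqrt{m} \rfloor$ the fit is tame; pushing N up to the interpolation limit m-1 brings back the endpoint oscillations (a small illustration, not my actual data):

```julia
# Runge's function on [-1, 1], sampled on m equidistant points
runge(x) = 1 / (1 + 25x^2)
m  = 21
xs = range(-1, 1, length=m)
ys = runge.(xs)

# Degree-N least-squares fit via a Vandermonde matrix
lsq_fit(N) = (xs .^ (0:N)') \ ys

# Worst-case error of the fitted polynomial on a fine grid
maxerr(c) = maximum(abs(evalpoly(x, c) - runge(x))
                    for x in range(-1, 1, length=1001))

N_safe = floor(Int, 2*sqrt(m))   # 9: inside the well-conditioned regime
N_bad  = m - 1                   # 20: interpolation, Runge oscillations

maxerr(lsq_fit(N_safe)), maxerr(lsq_fit(N_bad))
```

The error at `N_bad` is orders of magnitude larger than at `N_safe`, which is exactly the oscillation risk the rule of thumb avoids.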
Also worth reading: Chebyshev nodes - Wikipedia
For a polynomial of order 5×5 I would need only 36 function calls when using Chebyshev points, but 49 calls with an equidistant grid. I might still implement this, but first I had to make the conversion from my data frame to the X and Y vectors work…
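The Chebyshev-point variant would look roughly like this, using FastChebInterp's `chebpoints`/`chebinterp` instead of `chebregression`. This is only a sketch: `simulate` is a hypothetical smooth stand-in for the real simulation call, and the bounds are the ones from my data frame:

```julia
using FastChebInterp

# Hypothetical stand-in for the actual simulation at a (Λ0, Λest) point
simulate(Λ0, Λest) = Λ0^2 + 3Λ0*Λest - Λest^2

lb, ub = [0.85, 0.85], [1.0, 1.0]
order  = (5, 5)                     # 6 nodes per dimension -> 36 calls

pts  = chebpoints(order, lb, ub)    # 6×6 grid of 2D Chebyshev nodes
vals = [simulate(p[1], p[2]) for p in pts]
c_ep = chebinterp(vals, lb, ub)

c_ep([0.9, 0.9])
```

So the same `c_ep` approximation comes out of 36 simulation calls instead of 49, at the cost of sampling at the Chebyshev nodes rather than on my existing equidistant grid.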
I use this data frame because it makes it easy to plot the data, which represents simulation results as a set of lines.