Multivariate polynomial regression of discrete data in L-infinity norm

If I understand correctly what you are doing, what you want is a lot easier than the question the other posters seem to think they are answering.

Packages like Remez.jl solve the continuous minimax problem, in which one wants to minimize \Vert p(x) - f(x) \Vert_\infty over a continuous interval x \in [a,b], whereas it sounds like you only want to minimize the L_\infty error over a discrete set of data points.

In particular, suppose you have a set of m data points \{ x_k, y_k \}_{k = 1,\ldots, m} and you want to find a degree-n polynomial p(x) that minimizes the L_\infty error at those points:

\min_p \left[ \max_k |p(x_k) - y_k| \right]

This problem is straightforward to solve because it is equivalent to linear programming (LP). In particular, the “epigraph” reformulation into an LP works by adding an additional real parameter t \in \mathbb{R}:

\min_{p \in \text{polynomials}, t\in \mathbb{R}} t \\ \text{s.t. }\qquad t \ge p(x_k) - y_k, \; t \ge y_k - p(x_k) \qquad \text{for } k = 1,\ldots,m

(This is an LP because p(x_k) is a linear function of the polynomial coefficients in whatever basis. That’s true regardless of the dimensionality of x.)
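For example, here is a small sketch of that linearity (the 2d points and the total-degree basis below are made up purely for illustration): evaluating p at all of the data points is just a matrix–vector product of a Vandermonde-like “design” matrix with the coefficient vector.

```julia
# Hypothetical illustration: for 2d points xs[k] = (u, v) and a total-degree-n
# monomial basis, evaluating p at every data point is the linear map A*c
# applied to the coefficient vector c.
n = 3
xs = [(rand(), rand()) for _ in 1:20]                     # placeholder 2d data points
basis = [(i, j) for i in 0:n for j in 0:n if i + j <= n]  # exponents of monomials u^i * v^j
A = [u^i * v^j for (u, v) in xs, (i, j) in basis]         # m × length(basis) design matrix
# Then p(xs[k]) == (A*c)[k] for any coefficient vector c — linear in c.
```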

So, you can use any LP solver in one of Julia’s many convex-optimization packages to solve this.
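For concreteness, here is a minimal sketch using JuMP with the HiGHS LP solver (any LP-capable solver would do); the data, degree, and monomial basis are placeholders you would replace with your own:

```julia
using JuMP, HiGHS

xs = range(-1, 1, length=50)          # example 1d data points (placeholder)
ys = exp.(xs)                         # example data values (placeholder)
n = 5                                 # polynomial degree
A = [x^j for x in xs, j in 0:n]       # design matrix so that p(x_k) = (A*c)[k]

model = Model(HiGHS.Optimizer)
@variable(model, c[1:n+1])            # polynomial coefficients (monomial basis)
@variable(model, t)                   # epigraph variable t = max_k |p(x_k) - y_k|
@constraint(model, A*c .- ys .<= t)   # t >= p(x_k) - y_k
@constraint(model, ys .- A*c .<= t)   # t >= y_k - p(x_k)
@objective(model, Min, t)
optimize!(model)
coeffs, maxerr = value.(c), value(t)  # fitted coefficients and minimized L∞ error
```

The same code works for multivariate x: only the construction of the design matrix A changes (e.g. as in the 2d sketch above), while the LP itself is unchanged.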

PS. People may have been confused by the original title of your post, “Function Approximation in L-infinity Norm”, whereas you are really approximating data (i.e. doing regression, i.e. a discrete domain), not functions on a continuous domain. I’ve edited the thread title accordingly.
