Here is a modified example that fits a curve, specifically `log`, on the domain `[3/4, 3/2]`. It uses a degree-9 polynomial whose first two coefficients are explicitly set to `0` and `1` respectively, and the simple transform `f = x - 1` is used because of the root at `x = 1`.

```
using JuMP, GLPK
using Gadfly

# simple domain
dom = (0.75, 1.5)
xvals = collect(range(dom[1], dom[2]; length=101));
yvals = log.(xvals);

# minimax (infinity-norm) polynomial fit, posed as a linear program
function fit_poly_inf_norm(x, y, n)
    m = Model(GLPK.Optimizer)
    @variable(m, alpha[1:n+1])
    @variable(m, t >= 0.0)
    # pin the constant and linear coefficients to 0 and 1
    @constraint(m, alpha[1] == 0)
    @constraint(m, alpha[2] == 1)
    # |p(x_i) - y_i| <= t at every sample point
    @constraint(m, [i=1:length(y)], sum(alpha[j]*x[i]^(j-1) for j in 1:n+1) - y[i] <= t)
    @constraint(m, [i=1:length(y)], -t <= sum(alpha[j]*x[i]^(j-1) for j in 1:n+1) - y[i])
    @objective(m, Min, t)
    optimize!(m)
    coeffs = JuMP.value.(alpha)
    err = JuMP.value(t)
    return (coeffs, err)
end

degree = 9
# root at x=1, so approximate with f = x - 1
(coeffs, err) = fit_poly_inf_norm(xvals .- one(dom[1]), yvals, degree)
```

which gives the following result:

```
( [0.0, 1.0, -0.5000011437016926, 0.33333600206898395, -0.2499042550870806, 0.19972612897284364, -0.16863298896795648, 0.1506453512060692, -0.1197307463685523, 0.05204078783867752], -2.1019126300592106e-9 )
```

Note that the coefficients (after the leading zero) alternate in sign.
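For reference, this sign pattern matches the Maclaurin series `log(1+x) = x - x^2/2 + x^3/3 - …`, whose coefficients alternate in sign; the low-order fitted coefficients track it closely (a small standalone check, not part of the fit itself):

```
# Maclaurin coefficients of log(1+x): the x^j term has coefficient (-1)^(j+1)/j
taylor = [(-1)^(j+1) / j for j in 1:9]
# [1.0, -0.5, 0.3333…, -0.25, 0.2, …] — compare with the fitted
# [1.0, -0.50000114…, 0.33333600…, -0.24990425…, 0.19972612…, …] above;
# only the highest-order terms differ, where the minimax fit compensates.
```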

It's my understanding that adding constraints to force this sign pattern should be redundant, but the following

```
function fit_poly_inf_norm2(x, y, n)
    m = Model(GLPK.Optimizer)
    @variable(m, alpha[1:n+1])
    @variable(m, t >= 0.0)
    @constraint(m, alpha[1] == 0)
    @constraint(m, alpha[2] == 1)
    # explicitly force the alternating signs (these should be redundant)
    @constraint(m, alpha[3] <= 0)
    @constraint(m, 0 <= alpha[4])
    @constraint(m, alpha[5] <= 0)
    @constraint(m, 0 <= alpha[6])
    @constraint(m, alpha[7] <= 0)
    @constraint(m, 0 <= alpha[8])
    @constraint(m, alpha[9] <= 0)
    @constraint(m, 0 <= alpha[10])
    @constraint(m, [i=1:length(y)], sum(alpha[j]*x[i]^(j-1) for j in 1:n+1) - y[i] <= t)
    @constraint(m, [i=1:length(y)], -t <= sum(alpha[j]*x[i]^(j-1) for j in 1:n+1) - y[i])
    @objective(m, Min, t)
    optimize!(m)
    coeffs = JuMP.value.(alpha)
    err = JuMP.value(t)
    return (coeffs, err)
end

(coeffs2, err2) = fit_poly_inf_norm2(xvals .- one(dom[1]), yvals, degree)
```

gives a different result:

```
( [0.0, 1.0, -0.5000012569584985, 0.33333324995783115, -0.24988908708604318, 0.19981205548969036, -0.16904719456430753, 0.15022553801306932, -0.11640673034878432, 0.04871565717295707], 1.2662657797313237e-9 )
```
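If the sign constraints really are redundant, both coefficient vectors should be (near-)optimal, so evaluating them directly against `log` should show essentially the same maximum deviation. A quick standalone check (coefficients copied from the two outputs above; the exact deviations will depend on GLPK's feasibility tolerance):

```
# Compare the two fitted polynomials directly against log on the sample grid.
coeffs1 = [0.0, 1.0, -0.5000011437016926, 0.33333600206898395, -0.2499042550870806,
           0.19972612897284364, -0.16863298896795648, 0.1506453512060692,
           -0.1197307463685523, 0.05204078783867752]
coeffs2 = [0.0, 1.0, -0.5000012569584985, 0.33333324995783115, -0.24988908708604318,
           0.19981205548969036, -0.16904719456430753, 0.15022553801306932,
           -0.11640673034878432, 0.04871565717295707]
xs = range(0.75, 1.5; length=101)
# evalpoly takes coefficients in ascending order, matching alpha
maxdev(c) = maximum(abs(evalpoly(x - 1, c) - log(x)) for x in xs)
@show maxdev(coeffs1) maxdev(coeffs2)
```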

Can anybody explain why the results change when redundant constraints are added?

[Julia 1.5.3 and 1.6.0-rc3 give the same results.]