Solving a system of multivariate polynomials (or finding extrema)

I have a multivariate (in practice, 2–5 dimensions) function approximated with Chebyshev polynomials. I would like to find all the local extrema of the approximation. This is equivalent to solving the system obtained by setting the partial derivatives to zero, but there may be a more direct method (I also need to deal with extrema on the boundary, since the domain is bounded: [-1,1]^n).

Specifically, I have the approximation available either as weighted sums of a Chebyshev basis or, if preferable, as vectors of

(i j k ...) => c

where c would be the coefficient assigned to x_1^i x_2^j x_3^k \dots.
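For concreteness, here is a minimal plain-Julia sketch of that representation and of the gradient system I would need to solve; the helper names (`evalpoly_dict`, `partial`) are just illustrative, not from any package:

```julia
# Polynomial stored as a Dict mapping exponent tuples (i, j, ...) to coefficients c.
evalpoly_dict(p::Dict, x::AbstractVector) =
    sum(c * prod(x .^ collect(e)) for (e, c) in p)

# Partial derivative with respect to variable k, returned in the same format.
function partial(p::Dict, k::Int)
    dp = Dict{NTuple{length(first(keys(p))),Int},Float64}()
    for (e, c) in p
        e[k] == 0 && continue
        de = ntuple(i -> i == k ? e[i] - 1 : e[i], length(e))
        dp[de] = get(dp, de, 0.0) + c * e[k]
    end
    return dp
end

# Example: p(x, y) = 1 + 2 x y^2
p = Dict((0, 0) => 1.0, (1, 2) => 2.0)
grad = [partial(p, k) for k in 1:2]   # the system ∂p/∂x_k = 0 to be solved
```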

Is this a tractable problem? I only know a method for the univariate case (Boyd 2013).

If yes, what would be the recommended way in Julia?


maybe https://www.juliahomotopycontinuation.org
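Something along these lines, assuming the approximation has already been converted to the monomial basis (the toy polynomial below is only for illustration):

```julia
using HomotopyContinuation

@var x y
# Toy objective in the monomial basis; in practice this would be built
# from the (i, j, ...) => c coefficients of the approximation.
f = 1 + 2x*y^2 - x^3 - y^4

# Critical points in the interior: solve the square system ∇f = 0.
∇f = differentiate(f, [x, y])
result = solve(System(∇f; variables = [x, y]))

# Keep the real solutions inside the box [-1, 1]^2;
# extrema on the boundary still need to be handled separately.
crit = filter(p -> all(abs.(p) .<= 1), real_solutions(result))
```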


I’m pretty sure Chebfun can do this in 2D.


Yes, it can. But I am not sure what the algorithm is and if there is a Julia equivalent.

I asked Alex Townsend and he said it uses subdivision + Bezout resultants. I’m not sure it’s useful beyond 2D.


Assuming the number of solutions is finite and you can write out the equations in a standard monomial basis, you could try Algebraic Systems Solving · AlgebraicSolving.jl (algebraic-solving.github.io).
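A hedged sketch of that route, assuming the gradient system has exact rational coefficients (floating-point Chebyshev coefficients would need to be rationalized first); the exact calls below are my reading of the AlgebraicSolving.jl docs, so treat them as assumptions:

```julia
using AlgebraicSolving

# Bivariate toy system standing in for ∂f/∂x = 0, ∂f/∂y = 0.
R, (x, y) = polynomial_ring(QQ, ["x", "y"])
I = Ideal([3*x^2 - 2*y, 4*y^3 - 2*x])

# Real solutions of the zero-dimensional system
# (assumed entry point; check the package docs for the exact API).
sols = real_solutions(I)
```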
