SumOfSquares with BigFloat

I’m trying to use SumOfSquares with arbitrary precision. Is this possible?

If I do

using JuMP, SumOfSquares, COSMO, DynamicPolynomials

model = GenericModel{BigFloat}(COSMO.Optimizer{BigFloat})
# ...
@constraint(model, con1 in SOSCone())

where con1 is a

Polynomial{DynamicPolynomials.Commutative{DynamicPolynomials.CreationOrder}, Graded{LexOrder}, GenericAffExpr{BigFloat, GenericVariableRef{BigFloat}}}
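For context, con1 comes from something along these lines, continuing from the snippet above (a minimal sketch; the exact polynomial doesn’t matter, it’s just one way to end up with coefficients of that type):

@polyvar x y                      # DynamicPolynomials variables
@variable(model, t)               # a GenericVariableRef{BigFloat}
con1 = x^4 + y^4 + t * x^2 * y^2  # coefficients promote to GenericAffExpr{BigFloat, ...}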

Then I get a weird error:

ERROR: MethodError: no method matching promote_operation(::typeof(-), ::Type{…}, ::Type{…}, ::Type{…})
Stacktrace:
[1] concrete_bridge_type(::Type{…}, F::Type{…}, ::Type{…})

Replacing BigFloat with Float64 works fine. Should BigFloat be supported here, or am I misunderstanding something?

Hi @jezzaparker, I don’t think so. SumOfSquares.jl was written before JuMP gained support for other number types.

@blegat can confirm.
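For comparison, a plain JuMP GenericModel{BigFloat} (no SumOfSquares involved) should already work with a BigFloat-capable solver; roughly something like this untested sketch, reusing the COSMO optimizer from your snippet:

using JuMP, COSMO

model = GenericModel{BigFloat}(COSMO.Optimizer{BigFloat})
@variable(model, x >= 1)
@objective(model, Min, 2x)
optimize!(model)
value(x)  # a BigFloat

so the missing piece would be on the SumOfSquares side, not in JuMP itself.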

Thanks for reporting the issue. It doesn’t seem too hard to fix, see Fix bridge choice for non-Float64 by blegat · Pull Request #356 · jump-dev/SumOfSquares.jl · GitHub.
SumOfSquares is going through a big refactoring though, so you’ll need to check out the dev branches of a few packages to try it out.
It shouldn’t be too hard to backport the fix if needed.
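If you haven’t used dev branches before, it’s just the usual Pkg syntax, e.g. (the package and branch names here are only illustrative; check the PR to see exactly which packages are involved):

pkg> add SumOfSquares#master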
