This is a breaking release of Convex.jl which switches the backend from the deprecated MathProgBase (MPB) to MathOptInterface (usually shortened to MOI). This means that one should now use MOI-compatible solvers. For example, if you previously solved a problem `p` with

```
using Convex, ECOS
p = …
solve!(p, ECOSSolver())
```

then you should change the `solve!` command to

```
solve!(p, ECOS.Optimizer)
```

To pass arguments to the optimizer, create a closure, e.g.

```
solve!(p, () -> ECOS.Optimizer(verbose=0))
```
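The same closure pattern works for setting several solver options at once. As a sketch (the `maxit` iteration-limit option and the toy problem below are my own illustration, not from the release notes):

```
using Convex, ECOS

x = Variable()
p = minimize(x, x >= 1)
# The closure constructs a fresh optimizer with both options set
solve!(p, () -> ECOS.Optimizer(verbose=0, maxit=200))
```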

There are a few more small changes listed in the NEWS file.

This change has a few benefits.

## Performance

First, MOI can be much more performant than MPB (see e.g. issue #353: v0.13 is now as fast as R/CVXR, and scales linearly where v0.12 scaled quadratically). Note, however, that compile times seem to be longer via MOI than they were via MPB.

## Arbitrary numeric types

Second, MOI supports arbitrary numeric types. This opens the door to, for example, solving optimization problems in high precision. There are only a few high-precision solvers, but now you can use them easily. There are three important steps to solving a problem in high precision.

- First, formulate your problem in high precision. For example, instead of `x = sqrt(2) * y`, write `x = sqrt(big(2)) * y`.
- Second, pass the keyword argument `numeric_type` to the problem construction call. For example, `p = minimize(objective; numeric_type = BigFloat)`. The `numeric_type` argument tells Convex.jl what type of number to use when reformulating the problem internally, and what type of number it should tell MathOptInterface to use. Note that if you use complex numbers, you should still pass the real version to `numeric_type`.
- Then call the `solve!` command with a high-precision solver. The two that I know about are Tulip.jl, a pure Julia linear program solver, and SDPAFamily.jl, which calls the C++ solvers SDPA-GMP, SDPA-DD, and SDPA-QD for solving semidefinite optimization problems (SDPs). For example, `SDPAFamily` provides the optimizer `SDPAFamily.Optimizer`, which expects `BigFloat`s. To use another numeric type `T`, you can do `SDPAFamily.Optimizer{T}`. Tulip provides `Tulip.Optimizer`, which defaults to `Float64`, but you can pass e.g. `Tulip.Optimizer{BigFloat}`.

Here is a simple example putting all of this together to solve a trivial SDP with SDPAFamily:

```
using Convex, SDPAFamily, Test

# A 3x3 positive semidefinite matrix variable
y = Semidefinite(3)
# Maximize the smallest eigenvalue subject to a trace bound,
# formulating the problem internally in BigFloat precision
p = maximize(eigmin(y), tr(y) <= sqrt(big(2)); numeric_type = BigFloat)
solve!(p, () -> SDPAFamily.Optimizer(presolve=true))
@test p.optval ≈ sqrt(big(2))/3 atol=1e-30
```
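Tulip can be used the same way for a high-precision linear program. A minimal sketch, using a trivial scalar problem of my own invention rather than one from the release notes:

```
using Convex, Tulip

x = Variable()
# Formulate the problem in BigFloat precision throughout
p = minimize(x, x >= sqrt(big(2)); numeric_type = BigFloat)
# Tulip.Optimizer defaults to Float64; request BigFloat explicitly
solve!(p, () -> Tulip.Optimizer{BigFloat}())
```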

## Development

The last benefit is that by switching the backend to MOI, Convex.jl is better positioned for future development. We can lower the user’s code more directly to MOI constructs for improved performance, and we can use MOI’s bridging mechanism instead of atoms in some cases. MOI has a lot of strong development behind it, and Convex can benefit from that!