I am coming from MATLAB/Python, but Julia is becoming more attractive, especially for higher-precision arithmetic. Currently I am just trying out some toy problems to familiarize myself with Julia, JuMP, and the associated solvers.
I am trying to minimize the operator norm (spectral / Schatten-∞ norm) of a density matrix \rho, subject to the constraint that the outcome probability of some (fixed) test POVM element F_i is a valid probability: tr(F_i\rho) \in [0, 1].
Quantum info jargon aside, the important part here is that \rho is a Hermitian, positive semidefinite 2x2 matrix with trace 1 and complex entries. F_i can be some fixed 2x2 matrix that constrains \rho. Here is my first attempt at doing so with JuMP.
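Written out, the problem is

\min_\rho \lVert\rho\rVert_\infty \quad \text{s.t.} \quad \rho = \rho^\dagger, \; \rho \succeq 0, \; tr(\rho) = 1, \; tr(F_i\rho) \in [0, 1].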
using LinearAlgebra
using JuMP
using Hypatia
model = Model(() -> Hypatia.Optimizer(verbose = false))
#Introduce rho (H) as a 2x2 Hermitian PSD matrix variable
@variable(model, H[1:2, 1:2] in HermitianPSDCone())
#Unit trace constraint
@constraint(model, LinearAlgebra.tr(H) == 1)
#Epigraph variable and cone constraint for the operator norm of H
@variable(model, t)
@constraint(model, [t; vec(H)] in MOI.NormSpectralCone(2, 2))
#A test POVM element
F = [1+0*im 0; 0 0]
#The outcome probability must lie in [0, 1]
@constraint(model, real(LinearAlgebra.tr(F * H)) in MOI.Interval(0, 1))
#Minimize the op norm
@objective(model, Min, t)
println("Starting solver.")
optimize!(model)
println(termination_status(model))
println(objective_value(model))
println(value.(H))
However, I obtain the error:
ERROR: LoadError: MethodError: no method matching promote_operation(::typeof(vcat), ::Type{Float64}, ::Type{MathOptInterface.ScalarAffineFunction{ComplexF64}}, ::Type{Float64})
The function `promote_operation` exists, but no method is defined for this combination of argument types.
...
Can JuMP handle complex numbers in this case?
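For context, if the spectral norm cone cannot take complex entries, the fallback I have in mind is an LMI reformulation: since \rho \succeq 0, \lVert\rho\rVert_\infty \le t is equivalent to tI - \rho \succeq 0, which stays inside the Hermitian PSD cone. Below is a rough, untested sketch of that version (I am assuming a HermitianPSDCone constraint accepts a LinearAlgebra.Hermitian matrix of affine expressions, as in the JuMP complex-number docs).

using LinearAlgebra
using JuMP
using Hypatia

model = Model(() -> Hypatia.Optimizer(verbose = false))
#Same density-matrix variable and unit-trace constraint as above
@variable(model, H[1:2, 1:2] in HermitianPSDCone())
@constraint(model, LinearAlgebra.tr(H) == 1)

#Epigraph variable for the operator norm
@variable(model, t)
#For PSD H, opnorm(H) <= t is the LMI t*I - H >= 0 (in the PSD order);
#a general Hermitian matrix would also need t*I + H >= 0
I2 = Matrix{ComplexF64}(LinearAlgebra.I, 2, 2)
@constraint(model, LinearAlgebra.Hermitian(t .* I2 .- H) in HermitianPSDCone())

#Same test POVM element and probability constraint as above
F = [1+0*im 0; 0 0]
@constraint(model, real(LinearAlgebra.tr(F * H)) in MOI.Interval(0.0, 1.0))

@objective(model, Min, t)
optimize!(model)

Still, I would prefer to use the spectral norm cone directly if it can handle complex data, hence the question.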