I want to minimize f(x), with two constraints: (1) x is a unitary matrix, and (2) f_i(x) >= 0 for i = 1, …, N. (In my case, the f_i(x) are quadratic functions, e.g. abs(x[1, 1])^2 - 0.5.)
Because of the unitarity constraint, I want to use the manifold optimization support in Optim.jl. But as I understand it, one cannot impose additional constraints like f_i(x) >= 0 directly in Optim.jl. Is this correct?
A solution I thought of is to implement an augmented Lagrangian method myself, using Optim.jl as the inner solver. But I was wondering whether there is a simpler or better way.
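For what it's worth, the augmented Lagrangian idea can be sketched in a few lines. This is a minimal real-valued sketch (orthogonal rather than unitary, to keep it simple), with a placeholder objective and the single example constraint from the question; `Optim.Stiefel()` keeps each inner iterate orthogonal, and the outer loop does a standard Powell-Hestenes-Rockafellar multiplier update for the inequality. It is an illustration under those assumptions, not a tuned implementation.

```julia
using Optim, LinearAlgebra

# Placeholder objective (assumption): stay close to a fixed orthogonal matrix.
f(X) = sum(abs2, X .- I(3))
# The example constraint from the question, g(X) >= 0.
g(X) = X[1, 1]^2 - 0.5

function auglag(X0; μ = 10.0, λ = 0.0, outer = 15)
    X = X0
    for _ in 1:outer
        # Augmented Lagrangian (PHR form) for the inequality g(X) >= 0.
        L(Y) = f(Y) + (max(0.0, λ - μ * g(Y))^2 - λ^2) / (2μ)
        # Inner solve on the Stiefel manifold (square case: orthogonal group).
        res = Optim.optimize(L, X, Optim.LBFGS(manifold = Optim.Stiefel()))
        X = Optim.minimizer(res)
        λ = max(0.0, λ - μ * g(X))  # multiplier update
        μ *= 2.0                    # tighten the penalty
    end
    return X
end

X = auglag(Matrix{Float64}(I, 3, 3))
```

Extending this to the complex (unitary) case depends on whether the inner solver's retraction handles complex matrices; if not, you can stack real and imaginary parts as in the JuMP approach below.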
Any suggestions will be much appreciated!
This might be better suited to Convex.jl.
You could also use JuMP with something like
using JuMP, Ipopt
N = 3
I = zeros(N, N)
for i in 1:N
    I[i, i] = 1.0
end
model = Model(Ipopt.Optimizer)
@variable(model, x_re[i=1:N, j=1:N], start = I[i, j])
@variable(model, x_im[1:N, 1:N], start = 0.0)
# x = x_re + im * x_im; unitarity x' * x == I, split into real and imaginary parts:
@constraint(model, x_re' * x_re .+ x_im' * x_im .== I)
@constraint(model, x_re' * x_im .- x_im' * x_re .== 0)
but JuMP doesn’t support complex values, so you need to model both the real and imaginary parts.
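Continuing the model above, the same trick handles the inequality constraints: abs(x[1, 1])^2 is x_re[1, 1]^2 + x_im[1, 1]^2, so the example f_1(x) >= 0 from the question becomes a quadratic constraint. The objective below is a placeholder (maximize the real trace), since the question doesn't specify f; both are assumptions for illustration.

```julia
# Example inequality from the question: abs(x[1, 1])^2 - 0.5 >= 0.
@constraint(model, x_re[1, 1]^2 + x_im[1, 1]^2 >= 0.5)
# Placeholder objective (assumption): maximize the real trace of x.
@objective(model, Max, sum(x_re[i, i] for i in 1:N))
optimize!(model)
```

Ipopt handles the nonconvex quadratic equality constraints only as a local solver, so the result is a local optimum that may depend on the identity-matrix start point.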