Manifold optimization with additional constraints

I want to minimize f(x), with two constraints: (1) x is a unitary matrix, and (2) f_i(x) >= 0 for i = 1, …, N. (In my case, the f_i(x) are quadratic functions, e.g. abs(x[1, 1])^2 - 0.5.)

Because of the unitarity constraint, I want to use the manifold optimization support in Optim.jl. But as I understand it, one cannot impose additional constraints like f_i(x) >= 0 directly in Optim.jl. Is this correct?

A solution I thought of is to implement an augmented Lagrangian method myself, using Optim.jl as the inner solver. But I was wondering if there is a simpler or better way.
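
Roughly, what I have in mind is the sketch below. It is only an illustration for the real (orthogonal) case with a toy Brockett-type objective: the names A, f, fs, ρ, and the iteration counts are placeholders I made up, the penalty ρ is kept fixed for simplicity, and the complex unitary case would need complex-aware gradients or a real parametrization. I have not verified that Optim's Stiefel retraction plays nicely with autodiff = :forward.

using Optim, LinearAlgebra

# Toy problem, purely for illustration: minimize a Brockett-type cost over
# orthogonal matrices subject to abs(x[1, 1])^2 >= 0.5.
A = [1.0 0.0; 0.0 2.0]
f(x) = tr(x' * A * x * Diagonal([2.0, 1.0]))
fs = [x -> abs(x[1, 1])^2 - 0.5]          # constraints f_i(x) >= 0

function augmented_lagrangian(f, fs, x0; ρ = 10.0, outer_iters = 20)
    λ = zeros(length(fs))
    x = copy(x0)
    for _ in 1:outer_iters
        # Inner subproblem: minimize the augmented Lagrangian (PHR form for
        # inequality constraints) on the Stiefel manifold for fixed multipliers λ.
        Lρ(y) = f(y) + (ρ / 2) * sum(max(0.0, λ[i] / ρ - fs[i](y))^2 for i in eachindex(fs))
        res = optimize(Lρ, x, GradientDescent(manifold = Optim.Stiefel()); autodiff = :forward)
        x = Optim.minimizer(res)
        # Multiplier update: λ_i <- max(0, λ_i - ρ * f_i(x))
        λ .= max.(0.0, λ .- ρ .* [fi(x) for fi in fs])
    end
    return x
end

x0 = Matrix{Float64}(I, 2, 2)   # start on the manifold
xstar = augmented_lagrangian(f, fs, x0)

A real implementation would also grow ρ and check feasibility/convergence between outer iterations, but this is the basic structure I had in mind.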

Any suggestions will be much appreciated!

This might be better suited to Convex.jl:

https://jump.dev/Convex.jl/stable/examples/optimization_with_complex_variables/Fidelity%20in%20Quantum%20Information%20Theory/#Fidelity-in-quantum-information-theory

You could also use JuMP with something like

using JuMP, Ipopt

N = 3
Id = zeros(N, N)            # N-by-N identity, used for start values and the constraint
for i in 1:N
    Id[i, i] = 1.0
end

model = Model(Ipopt.Optimizer)
# Model the complex matrix x = x_re + im * x_im through its real and imaginary parts
@variable(model, x_re[i = 1:N, j = 1:N], start = Id[i, j])
@variable(model, x_im[1:N, 1:N], start = 0.0)
# Unitarity x' * x == I, i.e. (x_re' - im * x_im') * (x_re + im * x_im) == I,
# split into its real and imaginary parts:
@constraint(model, x_re' * x_re .+ x_im' * x_im .== Id)
@constraint(model, x_re' * x_im .- x_im' * x_re .== 0)
optimize!(model)

but JuMP doesn’t support complex-valued variables, so you need to model the real and imaginary parts separately (the two constraints above are the real and imaginary parts of x' * x == I).
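
To tie this back to your original problem, the objective and the inequality constraints can be written in terms of the same real and imaginary parts. A sketch continuing the model above; the objective here is just a placeholder, not your actual f:

# abs(x[1, 1])^2 - 0.5 >= 0, written via the real and imaginary parts
@constraint(model, x_re[1, 1]^2 + x_im[1, 1]^2 >= 0.5)

# Placeholder objective: stay close to the identity (replace with the real f)
@objective(model, Min, sum((x_re .- Id) .^ 2) + sum(x_im .^ 2))

optimize!(model)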