NLopt optimizer just returns the initial value

opt = Opt(:LD_SLSQP,4)
opt.lower_bounds = [0,0,0,0]
opt.ftol_abs = 1e-20
opt.min_objective = risk_budget_objective
equality_constraint!(opt,weights -> total_weight_constraint(x))
(optf,optx,ret) = optimize(opt,[0.25,0.25,0.25,0.25]) 

Sorry for asking a silly question…
There must be something wrong with the arguments passed to the function, but I couldn’t figure out what.
The returned values of optx and numevals are the initial value and 1, respectively.
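For completeness, this is how I inspect the result, following the pattern in the NLopt.jl README; perhaps the ret code is a clue (I read that :FORCED_STOP usually means an exception was thrown inside the objective or a constraint):

(optf, optx, ret) = optimize(opt, [0.25, 0.25, 0.25, 0.25])
numevals = opt.numevals  # number of function evaluations
println("got $optf at $optx after $numevals iterations (returned $ret)")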
Thank you for your patience and kindness in reading this.

The following is for reference, if needed:

using CSV,DataFrames,StatsBase,Statistics,LinearAlgebra,NLopt

df = DataFrame(CSV.File("D:\\Julia\\data.csv";select=["000852.SH","H11008.CSI","AU9999.SGE","CBA00301.CS"]))
df = dropmissing(df)  # dropmissing returns a new DataFrame; assign it (or use dropmissing!)
X = Matrix(df)
dfcov = cov(X)

function risk_budget_objective(weights::Vector,cov)
    cov = dfcov
    sigma = sqrt.(weights'*(cov'*weights))
    MRC = (cov'*weights)./sigma
    TRC = weights .* MRC
    delta_TRC = [sum((i .- TRC).^2) for i in TRC]
    return sum(delta_TRC)
end

function total_weight_constraint(x::Vector)
    return sum(x)-1.0
end

opt = Opt(:LD_SLSQP,4)
opt.lower_bounds = [0,0,0,0]
opt.ftol_abs = 1e-20
opt.min_objective = risk_budget_objective
equality_constraint!(opt,weights -> total_weight_constraint(x))
(optf,optx,ret) = optimize(opt,[0.25,0.25,0.25,0.25]) 

equality_constraint!(opt,weights -> total_weight_constraint(x))

This part isn’t correct. Your constraint function needs to compute the gradient (note also that your anonymous function takes weights but calls total_weight_constraint(x), where x is undefined). I didn’t test because I don’t have the data.csv, but this should point you in the right direction:

function total_weight_constraint(x::Vector, grad::Vector)
    if length(grad) > 0
        grad .= 1.0  # Needs to be the actual gradient. Change if your function changes
    end
    return sum(x) - 1.0
end
equality_constraint!(opt, total_weight_constraint)
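If SLSQP struggles to satisfy the constraint exactly, you can also pass a tolerance as the third argument (it defaults to 0); a small nonzero value is a common choice:

equality_constraint!(opt, total_weight_constraint, 1e-8)  # allow |sum(x) - 1| <= 1e-8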

Thank you for replying. I added the gradient argument to my constraint function; however, the optimizer is still returning the initial value. Should I also use a gradient argument in my objective function?

Oops, yes. I didn’t look at the objective. You need to follow the documentation:

The objective function must be a function which accepts two arguments, x and grad, and if length(grad) > 0, fills in grad with the gradient of the function.
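Again untested (I don’t have your data), but here is a sketch of your objective rewritten to the (x, grad) signature. Rather than deriving the gradient by hand, it uses ForwardDiff to fill it in; that is just the quickest way to get a correct gradient, not the only option. It assumes dfcov is the global covariance matrix from your script:

using ForwardDiff

function risk_budget_objective(weights::Vector, grad::Vector)
    # Plain-Julia version of your objective, closing over dfcov
    f(w) = begin
        sigma = sqrt(w' * dfcov * w)             # portfolio volatility
        TRC = w .* ((dfcov * w) ./ sigma)        # total risk contributions
        sum(sum((t .- TRC) .^ 2) for t in TRC)   # squared pairwise differences
    end
    if length(grad) > 0
        ForwardDiff.gradient!(grad, f, weights)  # fill grad in place
    end
    return f(weights)
end

opt.min_objective = risk_budget_objective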
