Constrained optimization without gradients

Hello,
I am very new to Julia, so apologies in advance for my questions.
I have some issues with the `optimize` function from the Optim.jl package.
First of all, I was not able to find a complete list, with descriptions, of all the optimization algorithms I can use. I know that Nelder-Mead is the default gradient-free method and that LBFGS is the default gradient-based one, but I would like to compare all the available algorithms to decide which one to use, and I could not find such a list in the documentation.

Another issue is that I would like to apply a simple constraint without using a gradient and/or Hessian. To be more precise, I just need the optimization to return a solution in which all values are positive.

This is the function I would like to optimize, to find the parameters x:

```julia
using LinearAlgebra  # for norm

function A(x)
    T = 0.0
    regularisation_factor = x[end]
    regularisation = norm(x[1:end-1])
    for i = 1:44
        numerator = 0.0
        denominator = 0.0
        for j = 1:44
            numerator += Q[i, j] * x[j]
            denominator += (Q[i, j] + M[i, j]) * x[j]
        end
        partial_sum1 = numerator / denominator
        partial_sum2 = smooth_intronic_average[i] /
                       (2 * smooth_intronic_average[i] + smooth_exonic_average[i])
        T += (partial_sum1 - partial_sum2)^2
    end
    T += regularisation_factor * regularisation^2
    return T
end
```

where smooth_intronic_average and smooth_exonic_average are two arrays of length 44, and Q and M are two matrices (44 x 50).
This is what I have done so far:

```julia
optimization = optimize(A, initial_guess)
```

with different choices of initial_guess. In every case the solution contained negative values, which are not acceptable for the scientific purpose I am using this function for (the x values should be transcription rates in a gene-expression model, and rates cannot be negative).
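For reference, a minimal runnable version of my setup (together with the definition of A above), using random placeholder data in place of my real Q, M, and average arrays, is:

```julia
using Optim

# placeholder random data with the same shapes as my real inputs
Q = rand(44, 50)
M = rand(44, 50)
smooth_intronic_average = rand(44)
smooth_exonic_average = rand(44)

# 50 candidate rates plus the regularisation factor as the last entry
initial_guess = ones(51)

optimization = optimize(A, initial_guess)   # Nelder-Mead by default
Optim.minimizer(optimization)               # this is where I see negative entries
```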

Thank you to everyone for your help.

S.

Please read PSA: how to quote code with backticks to see how you can quote your code to make it more readable.


Have you tried NLopt.jl? https://github.com/JuliaOpt/NLopt.jl/blob/master/README.md

I have used it in R for derivative-free optimization and it works amazingly well.
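If it helps, here is a rough sketch of what this might look like in NLopt.jl for your problem, using a derivative-free algorithm (COBYLA) with zero lower bounds to force every component to be non-negative. I am guessing 51 variables from your description, so adjust the dimension to match your actual x:

```julia
using NLopt

opt = Opt(:LN_COBYLA, 51)       # derivative-free and supports bound constraints
lower_bounds!(opt, zeros(51))   # enforce x[i] >= 0 for every component
xtol_rel!(opt, 1e-6)

# NLopt objectives take (x, grad); grad is ignored by derivative-free algorithms
min_objective!(opt, (x, grad) -> A(x))

# qualified call, to avoid a clash with Optim.optimize if both packages are loaded
(minf, minx, ret) = NLopt.optimize(opt, ones(51))
```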

If you really want to do nonlinear constrained optimization without gradients, I would recommend BlackBoxOptim.jl (box constraints only) or NLopt.jl. Optim.jl doesn’t do constrained optimization (right now), and most of its methods use gradients.
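For example, with BlackBoxOptim.jl a box-constrained, gradient-free run could look roughly like this (the upper bound of the search range is a placeholder; pick something sensible for your rates, and adjust the dimension to match your x):

```julia
using BlackBoxOptim

# search only over non-negative values in each of the 51 dimensions
result = bboptimize(A; SearchRange = (0.0, 100.0), NumDimensions = 51)

best_candidate(result)   # the minimizer, guaranteed to lie in [0, 100]^51
best_fitness(result)     # the objective value at that point
```

Another option, if you want to stay with Optim.jl’s Nelder-Mead, is to reparameterise: minimize y -> A(exp.(y)) over an unconstrained y and report exp.(y), which is positive by construction.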
