I am very new to Julia, so apologies in advance for my questions.
I have some questions about the optimize function from the Optim.jl package.
First of all, I was not able to find a complete, described list of the optimization algorithms I can use. I know that Nelder-Mead is the default derivative-free method and that LBFGS is the default gradient-based one, but I would like to survey every available algorithm before deciding which to use, and I could not find such a list in the documentation.
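For context, a specific algorithm can be selected by passing a method instance as the third argument to optimize. A minimal sketch (rosenbrock here is just a stand-in test function, not part of my problem; NelderMead and SimulatedAnnealing are derivative-free, while LBFGS will fall back to finite-difference gradients when none are supplied):

```julia
using Optim

# Stand-in test objective with known minimizer [1.0, 1.0].
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
x0 = [0.0, 0.0]

res_nm    = optimize(rosenbrock, x0, NelderMead())         # derivative-free simplex method
res_lbfgs = optimize(rosenbrock, x0, LBFGS())              # gradient-based; gradients are
                                                           # finite-differenced automatically
res_sa    = optimize(rosenbrock, x0, SimulatedAnnealing(), # stochastic, derivative-free
                     Optim.Options(iterations = 10^5))

println(Optim.minimizer(res_nm))
println(Optim.minimizer(res_lbfgs))
```

Each call returns a results object that can be queried with Optim.minimizer and Optim.minimum.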
Another issue is that I would like to apply a simple constraint without using a gradient and/or Hessian. To be more precise, I just need the optimization to return a solution with all positive values.
This is the function I would like to optimize to find the parameters x:
using LinearAlgebra  # for norm

function A(x)
    T = 0.0
    regularisation_factor = x[end]
    regularisation = norm(x[1:end-1])
    for i = 1:44
        numerator = 0.0
        denominator = 0.0
        for j = 1:44
            numerator += Q[i,j] * x[j]           # juxtaposition like Q[i,j]x[j] is a syntax error; use *
            denominator += (Q[i,j] + M[i,j]) * x[j]
        end
        partial_sum1 = numerator / denominator
        partial_sum2 = smooth_intronic_average[i] / (2smooth_intronic_average[i] + smooth_exonic_average[i])
        T += (partial_sum1 - partial_sum2)^2
    end
    return T + regularisation_factor * regularisation  # penalty added to the objective
end
where smooth_intronic_average and smooth_exonic_average are two arrays of length 44, and Q and M are two 44 × 50 matrices.
This is what I have done so far:
optimization = optimize(A, initial_guess)
with different values of initial_guess. In every case, the solution contained some negative components, which are not acceptable for my scientific purpose: the x values represent transcription rates in a gene-expression model and cannot be negative.
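Two common patterns for enforcing positivity in Optim.jl without hand-written derivatives, sketched on a toy objective f (a stand-in for A above, so the example is self-contained; exact behaviour may vary by Optim.jl version):

```julia
using Optim

f(x) = sum((x .- [2.0, 3.0]).^2)   # toy objective with positive minimizer; replace with A
x0 = [1.0, 1.0]

# Option 1: box constraints via Fminbox. The inner LBFGS uses
# finite-difference gradients, so no gradient code is needed.
lower = fill(1e-8, length(x0))     # strictly positive lower bounds
upper = fill(Inf,  length(x0))
res_box = optimize(f, lower, upper, x0, Fminbox(LBFGS()))

# Option 2: reparameterise x = exp.(y) and optimise over unconstrained y;
# this works with any algorithm, including derivative-free NelderMead.
g(y) = f(exp.(y))
res_exp = optimize(g, log.(x0), NelderMead())
x_opt = exp.(Optim.minimizer(res_exp))   # positive by construction
```

The reparameterisation keeps the problem unconstrained, at the cost of changing the geometry of the search space; the box-constrained route keeps the original parameters but relies on a gradient-based inner solver.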
Thank you to everyone for your help.