# Problem optimizing a 2 variable function (Optim .jl)

Hi! I want to optimize a two-variable function using Optim.jl. I've read the documentation but I still can't figure it out. I hope someone can help me. (I'm using the Optim and MittagLeffler packages in a Jupyter notebook with Julia 1.5.3.)

I have two arrays of data, `x_1` and `y_1`, and I have defined the following function, which I want to optimize:

```julia
function distancia2(α, m)
    distancias = 0.0
    for j in 1:length(x_1)  # length(x_1) == length(y_1)
        distancias += (9.387683485474406*mittleff(α, m*x_1[j]) + 71.89597427356132 - y_1[j])^2
    end
    return distancias
end
```

which is basically the sum of squared residuals of the Mittag-Leffler function against the set of data points that I have.

I want to find α and m that minimize `distancia2(α, m)`. I have tried `optimize(distancia2, [0.9900284232219172, -0.0067826190463882875])` (these are my initial guesses), but I get the following error:

```
MethodError: no method matching distancia2(::Array{Float64,1})
Closest candidates are:
  distancia2(::Any, ::Any) at In[24]:3

Stacktrace:
 [1] value!!(::NonDifferentiable{Float64,Array{Float64,1}}, ::Array{Float64,1}) at C:\Users\progr.julia\packages\NLSolversBase\geyh3\src\interface.jl:9
 [4] optimize(::Function, ::Array{Float64,1}; inplace::Bool, autodiff::Symbol, kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at C:\Users\progr.julia\packages\Optim\3K7JI\src\multivariate\optimize\interface.jl:90
 [5] optimize(::Function, ::Array{Float64,1}) at C:\Users\progr.julia\packages\Optim\3K7JI\src\multivariate\optimize\interface.jl:84
 [6] top-level scope at In[29]:1
```

Any help would be truly appreciated!!

Notice that Optim optimizes functions defined over vectors. An easy way to adapt your function would be:

```julia
function distancia3(x)
    α = x[1]
    m = x[2]
    distancia2(α, m)
end
```
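As a minimal, self-contained sketch of this pattern (with made-up toy data standing in for `x_1` and `y_1`, and a plain exponential in place of `mittleff` so it runs without MittagLeffler installed), the vector wrapper just unpacks the parameters and delegates:

```julia
# Toy data standing in for x_1 and y_1 (any equal-length vectors work).
x_1 = [0.0, 0.5, 1.0, 1.5]
y_1 = [1.0, 0.8, 0.5, 0.3]

# Same shape as the original objective, but with exp as a placeholder
# model so the example does not depend on MittagLeffler.
function distancia2(α, m)
    distancias = 0.0
    for j in 1:length(x_1)
        distancias += (α * exp(m * x_1[j]) - y_1[j])^2
    end
    return distancias
end

# Vector wrapper: Optim passes a single vector, which we unpack.
distancia3(x) = distancia2(x[1], x[2])

# The wrapper and the two-argument version agree on any input.
distancia3([1.0, -0.5]) == distancia2(1.0, -0.5)  # true
```

The same wrapper then works as `optimize(distancia3, initial_guess)`.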


If you are worried about performance, you'll want to read the performance tips in the documentation.
Finally, perhaps you'd also want to know about ComponentArrays.jl. I'm a big fan of that package.


Thanks a lot! I copied that code and it works perfectly. My only concern is that I'm not getting the candidate solution itself back. An example of the output is the following:

```
 * Status: success

 * Candidate solution
    Final objective value:     5.038829e+00

 * Found with

 * Convergence measures
    √(Σ(yᵢ-ȳ)²)/n ≤ 1.0e-08

 * Work counters
    Seconds run:   1  (vs limit Inf)
    Iterations:    42
    f(x) calls:    83
```

Why is this?

Do you mean the optimal values? Optim returns an object that contains them; you can extract them like this:

```julia
res = optimize(...)
Optim.minimizer(res)
```


The documentation lists everything inside the object returned.
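For instance, here is a hedged sketch using a simple quadratic objective in place of the Mittag-Leffler fit (assumes Optim is installed; `optimize` with only a function and a starting point defaults to Nelder–Mead):

```julia
using Optim

# Minimize f(x) = (x₁ - 1)² + (x₂ - 2)², whose minimizer is [1, 2].
f(x) = (x[1] - 1.0)^2 + (x[2] - 2.0)^2

res = optimize(f, [0.0, 0.0])  # default method: Nelder–Mead

# Extract the argmin and the minimum from the result object.
xmin = Optim.minimizer(res)    # ≈ [1.0, 2.0]
fmin = Optim.minimum(res)      # ≈ 0.0
```

In your case, `Optim.minimizer(res)` would return the fitted `[α, m]`.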


Perfect. You are very kind.
