# Optim.jl - How to find local maxima of multivariate function

Pardon my ignorance (if you’ve seen any recent posts of mine you’ll know I’ve been studying calculus lately) but I’m trying to understand how to find local maxima of a multivariate function with Optim.jl. Given the following function, it’s pretty easy to pick a starting point and let Optim work its magic to find local minima:

```julia
using Optim
using Plots
using Plots.PlotMeasures
pyplot(size=(1200, 600))

f(x, y) = -5 * x * y * exp(-x^2 - y^2)

# Minimize over both variables by unpacking the vector argument
res = optimize(v -> f(v[1], v[2]), [1.0, 1.0])
res2 = optimize(v -> f(v[1], v[2]), [-0.75, -0.75])

# Grid for the contour plot
x = range(-2, 2; length=100)
y = range(-2, 2; length=100)
z = [f(xi, yi) for yi in y, xi in x]

# Contour plot with local minima added
p = contour(
    x,
    y,
    z,
    color=:blues,
    legend=false,
    xlabel="x",
    ylabel="y"
)
scatter!([Optim.minimizer(res)[1]], [Optim.minimizer(res)[2]], marker=:circle)
scatter!([Optim.minimizer(res2)[1]], [Optim.minimizer(res2)[2]], marker=:circle)
```

In this case, I can obviously just reuse the values from `res` and `res2` and flip the sign of the `y` coordinate (`-(Optim.minimizer(res)[2])`), but is there a way to have Optim find local maxima just like it finds local minima? The docs seem to only discuss minimizing functions.

Multiply the objective by `-1`.

Maximizing `f(x)` is the same as minimizing `-f(x)`.
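Applied to the function from the question, this looks like the sketch below: negate the objective before handing it to `optimize`, then negate the reported minimum to recover the maximum value. The starting point `[-1.0, 1.0]` is my choice (the maxima of this `f` lie in the quadrants where `x*y < 0`); any point near a peak works.

```julia
using Optim

# Same function as in the question; its maxima are the minima of -f.
f(x, y) = -5 * x * y * exp(-x^2 - y^2)

# Minimize the negated objective to find a local maximum.
res = optimize(v -> -f(v[1], v[2]), [-1.0, 1.0])

xmax, ymax = Optim.minimizer(res)
fmax = -Optim.minimum(res)  # flip the sign back to get the maximum of f
```

For this `f` the maximizer found from that start is near `(-1/√2, 1/√2)`, where `f` takes the value `5/(2e) ≈ 0.92`.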


Thanks! I actually did that after realizing myself it was the same. It would be nice to have a keyword argument that allows you to tell it to maximize rather than minimize but maybe there’s a good reason why that doesn’t exist.

Instead, I often wish that the function name was `minimize` instead of `optimize`, so that I don’t have to keep looking up the docs (each library does something different).

Then

```julia
maximize(f, args...) = minimize(x -> -f(x), args...)
```

would be a trivial wrapper (of course for AD etc it would be more complicated).
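Built on the `optimize` function Optim actually exports, such a wrapper might look like the sketch below. The name `maximize_sketch` and the helper for reading off the maximum are hypothetical, purely for illustration.

```julia
using Optim

# Hypothetical wrapper: maximize f by minimizing -f via Optim.optimize.
maximize_sketch(f, x0, args...; kwargs...) =
    optimize(x -> -f(x), x0, args...; kwargs...)

f(x, y) = -5 * x * y * exp(-x^2 - y^2)
res = maximize_sketch(v -> f(v[1], v[2]), [-1.0, 1.0])

# Optim reports the minimum of -f, so negate it to get the maximum of f.
peak = -Optim.minimum(res)
```

The caveat in the post above still applies: for gradient-based methods with automatic differentiation, negating the objective inside a closure is slightly more involved than this one-liner suggests.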
