Nelder Mead requires derivatives?

Hello everyone

I was trying to optimize a problem with code that looks roughly like this:

using Optimization
using OptimizationOptimJL

function cost(u,p)
    # define the cost function here
end

prob = Optimization.OptimizationProblem(
    Optimization.OptimizationFunction(cost), 
    [2e-5], #u0
    (p1, p2, p3, p4), # p
    lb=[1e-8], ub=[1e-4]
)
ρ = OptimizationOptimJL.solve(prob, NelderMead(), maxiters=50, maxtime=10^5)

But when I try to run it I get the following:

ERROR: ArgumentError: Fminbox(NelderMead{Optim.AffineSimplexer, Optim.AdaptiveParameters}(Optim.AffineSimplexer(0.025, 0.5), Optim.AdaptiveParameters(1.0, 1.0, 0.75, 1.0))) requires gradients, use `OptimizationFunction` either with a valid AD backend https://docs.sciml.ai/Optimization/stable/API/ad/ or a provided 'grad' function.

I'm not sure why it would ask for derivatives here, since Nelder–Mead is supposed to be derivative-free.

I tried playing around with NelderMead(parameters = FixedParameters()), but then I get FixedParameters is not defined in Main.

Am I missing something?

Derivatives are not really an option here as far as I'm aware, since the cost function involves a large random-walk algorithm. I don't necessarily need Nelder–Mead, so I'm open to alternative suggestions.

I've run into this issue before too. I think the problem is that, because the problem has lb/ub bounds, Optimization.jl wraps Optim's NelderMead in Fminbox, and it's Fminbox that requires gradients.
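One workaround while that's the case, assuming you can live with a penalty instead of hard box constraints (the cost body and parameter values below are placeholders, not your real problem): drop lb/ub so the plain NelderMead gets passed through, and reject out-of-bounds points inside the cost itself.

using Optimization, OptimizationOptimJL

# With no lb/ub on the problem, Optimization.jl hands NelderMead to Optim
# unwrapped, so no gradients are needed. The bounds are enforced instead by
# returning Inf for points outside [1e-8, 1e-4].
function cost(u, p)
    if u[1] < 1e-8 || u[1] > 1e-4
        return Inf                     # reject out-of-bounds simplex points
    end
    return (u[1] - p[1])^2             # placeholder for the real cost
end

prob = OptimizationProblem(OptimizationFunction(cost), [2e-5], (5e-5,))
sol  = solve(prob, NelderMead(); maxiters = 50, maxtime = 10^5)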

1 Like

Thanks for the input!

I might try switching to Optim.jl directly, since Brent's method or golden-section search could be a better fit for this one-dimensional problem.
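For reference, a minimal sketch of that route using Optim.jl's univariate interface, with my_cost standing in for the real objective:

using Optim

my_cost(x) = (x - 5e-5)^2                      # stand-in for the real scalar cost

# Derivative-free, bounded univariate search on [1e-8, 1e-4]
res = optimize(my_cost, 1e-8, 1e-4, Brent())   # or GoldenSection()
Optim.minimizer(res)                           # best x found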

It would still be good to have the issue above fixed, though; it looks like a bug to me.

1 Like

I would recommend opening an issue on the Optimization.jl GitHub page.

3 Likes

@Vaibhavdixit02 might be the person to ask here

1 Like

Done :white_check_mark:

1 Like

Yeah, it's an Optim Fminbox thing. We should fix it, because it's not a fun error to run into. That said, the PRIMA methods generally seem to work better for derivative-free problems like this (we'll get a good set of benchmarks up on this soon).
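A sketch of the PRIMA route, assuming the OptimizationPRIMA wrapper package is installed (BOBYQA is the derivative-free, box-constrained solver in that family; the cost and parameters below are placeholders):

using Optimization, OptimizationPRIMA

cost(u, p) = (u[1] - p[1])^2           # placeholder for the real cost

prob = OptimizationProblem(OptimizationFunction(cost), [2e-5], (5e-5,);
                           lb = [1e-8], ub = [1e-4])
sol  = solve(prob, BOBYQA())           # derivative-free and respects the bounds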

1 Like