Newton optimization with Optim doing strange things

Floats don’t work like that:

julia> 10.0^100
1.0e100

julia> 10.0^1000
Inf

As far as I can tell, Optim takes a step, the line search tells it the step is fine, and it continues. But the second step lands in a region where f(x1) is smaller than f(x0) while your gradient/Hessian are NaNs, which I imagine is because x1 goes outside of the “physical” parameter space for x.
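To see how that can happen, here is a one-dimensional toy (not your problem, just the floating-point mechanism): the objective still evaluates to something finite and smaller, while the gradient has already turned into NaN through overflow.

# Toy illustration only: finite objective value, NaN gradient, caused by overflow.
f(x) = x * exp(-x^2)                # x^2 overflows to Inf, exp(-Inf) == 0.0
g(x) = exp(-x^2) * (1 - 2x^2)       # 0.0 * (-Inf) == NaN

x1 = 1.0e200
f(x1)    # 0.0  -- looks like a perfectly acceptable decrease to the line search
g(x1)    # NaN  -- any Newton step built from this is garbage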

Also, the Hessian at convergence of the second algorithm is not positive definite, which means that there’s something fishy in your problem. If I had to guess, from the structure of the Hessian I’d say that you’re minimizing a Lagrangian?

Your problem seems quite weird. A gradient with sup-norm 6.0e301 after the first step with step size 1.0? Something seems very wrong. When taking the second step, the first element of your gradient is 5.44e180 - no wonder you’re experiencing weird steps here!

PF might not be perfect, but are you sure everything here is actually correct?

What could be the problem?

So what is the trace for Newton with \ ?

Also, what exactly is the thing you’re minimizing here?

Yes, I am minimizing a Lagrangian.

The non-positive factorization trace is also in my post.

Knew it! :slight_smile:

Well, don’t. That’s not how constrained optimization works. The minimum of the Lagrangian is -Inf, pretty much always. The point you’re looking for is a saddle point (not a minimum) of the Lagrangian, which you can get by solving the nonlinear system of equations ∇L = 0, using for instance NLsolve, or, better yet, a specialized constrained optimization package.
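For a concrete (made-up) example of what I mean: minimize x1^2 + x2^2 subject to x1 + x2 = 1. The Lagrangian is L(x, λ) = x1^2 + x2^2 + λ(x1 + x2 - 1), and the point you want solves ∇L = 0, which you can hand to NLsolve as a plain root-finding problem:

using NLsolve

# Stationarity conditions of L(x, λ) = x1^2 + x2^2 + λ*(x1 + x2 - 1)
function gradL!(F, z)
    x1, x2, λ = z
    F[1] = 2x1 + λ        # ∂L/∂x1 = 0
    F[2] = 2x2 + λ        # ∂L/∂x2 = 0
    F[3] = x1 + x2 - 1    # ∂L/∂λ  = 0  (the constraint)
end

sol = nlsolve(gradL!, zeros(3))
sol.zero    # ≈ [0.5, 0.5, -1.0]: a saddle point of L, not a minimum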

1 Like

So what Antoine is saying here is correct, and the fact that PF doesn’t work while \ does is merely luck, I think. Neither is supposed to work, as the line search and the whole setup in general will try to take you downhill.

What packages do you guys recommend?

The JuMP interface is a complete mess compared to Optim.

Have you tried NLopt.jl?
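In case it helps, here is roughly what this kind of problem looks like in NLopt.jl, using SLSQP, which takes gradients but no Hessian (the objective and constraint below are just a toy, to show the shape of the API):

using NLopt

# Toy problem: minimize x1^2 + x2^2 subject to x1 + x2 = 1
function objective(x, grad)
    if length(grad) > 0
        grad[1] = 2x[1]
        grad[2] = 2x[2]
    end
    return x[1]^2 + x[2]^2
end

function constraint(x, grad)
    if length(grad) > 0
        grad[1] = 1.0
        grad[2] = 1.0
    end
    return x[1] + x[2] - 1.0    # enforced as == 0
end

opt = Opt(:LD_SLSQP, 2)
xtol_rel!(opt, 1e-8)
min_objective!(opt, objective)
equality_constraint!(opt, constraint, 1e-8)

minf, minx, ret = NLopt.optimize(opt, [0.0, 0.0])
minx    # ≈ [0.5, 0.5]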

May I propose ConstrainedOptim? It hasn’t been tested by many users yet, but I’d be keen to see how you find it. (We are hoping to move it into Optim “in the future”.)

If the constraints of your original problem are linear, it will be quite easy to set up the problem and the constraints.
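For instance, with a single linear inequality constraint the setup looks roughly like this. This is a sketch written against the IPNewton / TwiceDifferentiableConstraints interface as documented for Optim after the merge; the toy problem and names are only illustrative, and ConstrainedOptim’s current syntax may differ slightly.

using Optim

# Toy problem: minimize x1^2 + x2^2 subject to the linear constraint x1 + x2 >= 1
f(x)      = x[1]^2 + x[2]^2
g!(G, x)  = (G .= 2 .* x; G)
h!(H, x)  = (H[1,1] = 2.0; H[1,2] = 0.0; H[2,1] = 0.0; H[2,2] = 2.0; H)

con_c!(c, x)    = (c[1] = x[1] + x[2]; c)           # c(x) = A*x with A = [1 1]
con_jac!(J, x)  = (J[1,1] = 1.0; J[1,2] = 1.0; J)   # constant Jacobian
con_h!(h, x, λ) = h                                 # linear constraint: no Hessian contribution

lx, ux = fill(-Inf, 2), fill(Inf, 2)   # no box constraints on x
lc, uc = [1.0], [Inf]                  # 1 <= x1 + x2

x0  = [1.0, 1.0]                       # strictly feasible starting point
df  = TwiceDifferentiable(f, g!, h!, x0)
dfc = TwiceDifferentiableConstraints(con_c!, con_jac!, con_h!, lx, ux, lc, uc)

res = optimize(df, dfc, x0, IPNewton())
Optim.minimizer(res)   # ≈ [0.5, 0.5]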

1 Like

I will check both (NLopt and ConstrainedOptim) over the weekend. I’m leaning towards NLopt because I want to be able to run the optimization with or without supplying the Hessian.

Fair enough re Hessian. We definitely need to get the ball rolling on implementing a quasi-Newton interior point algorithm.

1 Like

What do your (current and future) optimization problems and constraints look like? What dimensions, number of constraints, and type of constraints are we talking about? Do you have a reference for us?

2 Likes

This is key if you want more help no matter the package to be used.

2 Likes

I think this is a majorly undermentioned point with mathematical OSS. Seriously, the best things ever are tests and benchmarks. If you want packages to optimize for your problem and keep it working, donate some “integration tests” related to your domain along with some benchmarks in whatever format the package authors are looking for (and give a description of what it is for future citations). I think most authors will love to have a nice new test problem donated to them, even if it starts as @test_broken. Just make sure you abstract out any extraneous dependencies and boil it down to the key features of the problem and it’s a good addition to any test suite.
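For example, a donated integration test can be as small as this (everything below is a made-up placeholder, just to show the shape):

using Test

# Hypothetical donated test: a tiny, dependency-free instance of the domain
# problem with its known answer, marked broken until the package handles it.
solve_toy_constrained_problem() = zeros(2)        # placeholder for the real solver call

@testset "equality-constrained toy problem" begin
    known_solution = [0.5, 0.5]                   # analytic answer for the toy instance
    @test_broken solve_toy_constrained_problem() ≈ known_solution
end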

4 Likes