How to approach my unconstrained optimization problem?

I’ve got an objective function to maximize. The domain is ℝ^n with n around thirty, and the codomain is a bounded subset of the nonnegative reals. The objective is zero over most of the domain; it can only be positive within a known small subset of the domain.

I don’t strictly need the global maximum, but I want to find as good a value as I can manage.

The objective is not a black box: it’s Julia code I’ll write. Still, I don’t think it would be very amenable to autodiff, so I’ll probably have to use finite differences. I can’t share the objective.

The objective has discontinuities where the value jumps from zero to a positive value. There may also be nondifferentiable regions within the region where the objective can be positive; I’m not sure yet.

I suppose there are regions of the domain where the function is hopefully differentiable, at least in some of the thirty-ish variables, so I’m hoping something like gradient descent could help improve whatever objective value I find by uniform random sampling. Does this make sense?
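Roughly what I have in mind, with a placeholder objective standing in for the real one (all names and the step size here are illustrative assumptions, not the actual problem):

```julia
using FiniteDiff  # finite-difference gradients, since autodiff may not apply

# placeholder objective: zero almost everywhere, positive near 0.5*ones(n)
f(x) = max(0.0, 1.0 - sum(abs2, x .- 0.5))

n = 30

# stage 1: uniform random sampling over a known box, keep the best point
best_x, best_f = fill(0.5, n), -Inf
for _ in 1:10_000
    x = rand(n)                    # uniform on [0,1]^n; adjust to the real region
    fx = f(x)
    if fx > best_f
        best_x, best_f = x, fx
    end
end

# stage 2: crude gradient ascent from the best sample
x = copy(best_x)
for _ in 1:200
    g = FiniteDiff.finite_difference_gradient(f, x)
    x .+= 1e-2 .* g                # fixed step size; a line search would be better
end
```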

Questions:

  1. Since I can’t plot the objective in about thirty dimensions, is there some numerical or other way to assess how helpful gradient descent could be here?
  2. What Julia packages could be helpful? There are quite a lot of registered packages doing gradient-based optimization.
  1. You can visualize cross-sections. If you know that the objective is differentiable with respect to one variable x, you can fix the remaining variables and plot the objective value for several values of x.
  2. I would try BlackBoxOptim.jl and/or PRIMA.jl.
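For the cross-section idea, a minimal sketch using Plots.jl (the objective `f` and base point `x0` here are placeholders):

```julia
using Plots

f(x) = max(0.0, 1.0 - sum(abs2, x))    # placeholder objective
x0 = zeros(30)                          # base point: all variables held fixed

i = 1                                   # coordinate to vary
ts = range(-2, 2; length = 200)
vals = map(ts) do t
    x = copy(x0)
    x[i] = t                            # vary only the i-th coordinate
    f(x)
end
plot(ts, vals; xlabel = "x[$i]", ylabel = "objective", legend = false)
```

Repeating this for a few coordinates and a few base points gives a rough feel for how smooth the objective is along each axis.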

Can you tell if a solution is good? If so, I would just try the two packages and see how they do.


Thanks, I’ll look into your suggestions once I implement the objective!

Yeah, bigger is better.

Both of these and others (including my personal favorite CMAEvolutionStrategy) are wrapped by the meta-package Optimization.jl. This makes it easy to try a number of different optimizers with the same API.
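A rough sketch of what that looks like (the placeholder objective and the exact solver names from the OptimizationBBO and OptimizationCMAEvolutionStrategy sub-packages should be double-checked against the docs):

```julia
using Optimization, OptimizationBBO

# Optimization.jl minimizes, so negate the objective to maximize it
obj(x, p) = -max(0.0, 1.0 - sum(abs2, x .- 0.5))   # placeholder objective

n = 30
prob = OptimizationProblem(obj, fill(0.5, n); lb = zeros(n), ub = ones(n))
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radius(); maxiters = 10_000)

# switching optimizers only changes the algorithm argument, e.g.:
# using OptimizationCMAEvolutionStrategy
# sol = solve(prob, CMAEvolutionStrategyOpt(); maxiters = 10_000)
```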


Maybe these methods are useful in your case: [JuliaIntervals/IntervalOptimisation.jl](https://github.com/JuliaIntervals/IntervalOptimisation.jl) (rigorous global optimisation in pure Julia).

I was quite impressed, last time I checked, by how robust they are even for very ill-behaved functions.
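A minimal sketch of its interface, assuming the `minimise` entry point from the package README (the objective must be evaluable with interval arithmetic, and a 30-dimensional search box is far more demanding than this toy 2-D one):

```julia
using IntervalArithmetic, IntervalOptimisation

f(x) = 1.0 - sum(abs2, x)        # placeholder; must work on intervals

X = IntervalBox(-1..1, 2)        # rigorous search box, here 2-D
# minimise the negated objective to maximise f; returns an enclosure
# of the global minimum plus a list of candidate boxes
minval, minimisers = minimise(x -> -f(x), X)
```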
