Optim and differential equations?

OK – I’ve tested some options. First, a somewhat nicer plot of the scalar variation in the model-fit loss function for my 14 parameters (each parameter has been scaled so that its initial guess is 1.0):
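For reference, a minimal sketch of what I mean by the scaled cost function. This is an assumption about the setup: `model_loss` and `p0_nominal` are placeholders, and only the signature `cost(par, idx)` appears in the calls below.

```julia
# Sketch (assumptions): `cost(par, idx)` varies only the parameters listed
# in `idx`, expressed relative to the initial guess `p0_nominal`, so that
# `par = 1.0` reproduces the starting model.
function cost(par, idx; p0 = p0_nominal)
    p = copy(p0)
    p[idx] .*= par            # scale the selected parameter(s)
    return model_loss(p)      # `model_loss` stands in for the actual fit loss
end
```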

Next, I consider the scalar variation in parameter 7, i.e., the loss function shown in element (2,3) of the plot matrix (“UAs2Fe”).

Using a univariate method (Brent’s method), I get:

julia> loss = (par) -> cost(par,[7])
julia> optimize(loss,0.5,1.5)
Results of Optimization Algorithm
 * Algorithm: Brent's Method
 * Search Interval: [0.500000, 1.500000]
 * Minimizer: 5.955942e-01
 * Minimum: 3.981820e-02
 * Iterations: 32
 * Convergence: max(|x - x_upper|, |x - x_lower|) <= 2*(1.5e-08*|x|+2.2e-16): true
 * Objective Function Calls: 33
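For completeness, the same call can be written with the method and tolerances made explicit; this is a sketch of the standard Optim.jl univariate interface, where `Brent()` is already the default for a bounded scalar problem:

```julia
# Sketch: explicit method and tolerances for the univariate solve.
res = optimize(loss, 0.5, 1.5, Brent(); rel_tol = 1e-8, abs_tol = 1e-12)
Optim.minimizer(res), Optim.minimum(res)
```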

If I use a multivariable method (Fminbox) with the default solver, I get:

julia> optimize(loss,[0.5],[1.5],[1.0],Fminbox())
Results of Optimization Algorithm
 * Algorithm: Fminbox with L-BFGS
 * Starting Point: [1.0]
 * Minimizer: [0.7577163938258104]
 * Minimum: 6.411025e-02
 * Iterations: 3
 * Convergence: true
   * |x - x'| ≤ 0.0e+00: true 
     |x - x'| = 0.00e+00 
   * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: true
     |f(x) - f(x')| = 0.00e+00 |f(x)|
   * |g(x)| ≤ 1.0e-08: false 
     |g(x)| = 1.18e+01 
   * Stopped by an increasing objective: true
   * Reached Maximum Number of Iterations: false
 * Objective Calls: 500
 * Gradient Calls: 500

Here, the result is quite wrong: the solver reports convergence after only 3 iterations because the objective started increasing, even though the gradient norm is still 1.18e+01.
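One thing that might help here: by default Optim.jl approximates the gradient with finite differences, which can be noisy for an ODE-based loss. A sketch of requesting a ForwardDiff-based gradient instead (this assumes `cost` is generic enough to accept dual numbers):

```julia
# Sketch: same Fminbox/L-BFGS call, but asking Optim.jl for a
# forward-mode AD gradient instead of finite differences.
res = optimize(loss, [0.5], [1.5], [1.0], Fminbox(LBFGS()),
               Optim.Options(); autodiff = :forward)
```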

Alternatively, I try GradientDescent, which leads to a linesearch failure:

julia> optimize(loss,[0.5],[1.5],[1.0],Fminbox(GradientDescent()))
┌ Warning: Linesearch failed, using alpha = 8.32126246806568e-12 and exiting optimization.
│ The linesearch exited with message:
│ Linesearch failed to converge, reached maximum iterations 50.
└ @ Optim C:\Users\user_name\.julia\packages\Optim\Agd3B\src\utilities\perform_linesearch.jl:47
┌ Warning: f(x) increased: stopping optimization
└ @ Optim C:\Users\user_name\.julia\packages\Optim\Agd3B\src\multivariate\solvers\constrained\fminbox.jl:293

Results of Optimization Algorithm
 * Algorithm: Fminbox with Gradient Descent
 * Starting Point: [1.0]
 * Minimizer: [0.5627850567743929]
 * Minimum: 4.219153e-02
 * Iterations: 2
 * Convergence: false
   * |x - x'| ≤ 0.0e+00: false 
     |x - x'| = 2.67e-11 
   * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
     |f(x) - f(x')| = 5.59e-11 |f(x)|
   * |g(x)| ≤ 1.0e-08: false 
     |g(x)| = 3.21e+00 
   * Stopped by an increasing objective: true
   * Reached Maximum Number of Iterations: false
 * Objective Calls: 305
 * Gradient Calls: 305

Even though the linesearch failed, the result is closer to the univariate minimizer than the one from L-BFGS.
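Both runs above were stopped by an increasing objective. A sketch of loosening that behavior via `Optim.Options` (the `allow_f_increases` flag lets the inner solver continue past a temporary increase; the tolerance values here are illustrative, not tuned):

```julia
# Sketch: allow temporary objective increases and set explicit tolerances.
opts = Optim.Options(allow_f_increases = true, g_tol = 1e-6,
                     outer_iterations = 10)
res = optimize(loss, [0.5], [1.5], [1.0], Fminbox(GradientDescent()), opts)
```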

Finally, with the BlackBoxOptim default algorithm:

julia> res = bboptimize(loss; SearchRange = (0.5,1.5), NumDimensions = 1, TraceMode = :silent)
julia> best_candidate(res),best_fitness(res)
([0.596312], 0.03959384553988452)

which is quite close to the result of Brent’s scalar method.
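Since the scalar tests look sane, the natural next step is the full 14-parameter search. A sketch of extending the same BlackBoxOptim call; this assumes `cost(par, idx)` accepts the full index range, and the common search range (0.5, 1.5) for all parameters is a guess that may need widening per parameter:

```julia
# Sketch: the same bboptimize call over all 14 scaled parameters.
res = bboptimize(par -> cost(par, 1:14);
                 SearchRange = (0.5, 1.5),
                 NumDimensions = 14,
                 MaxTime = 60.0,          # seconds; adjust to the problem
                 TraceMode = :silent)
best_candidate(res), best_fitness(res)
```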