Hello, I am new to Julia but not to optimization. I have implemented a large direct-collocation constrained NLP in MATLAB; it works, but it is a bit slow for my taste (about 5 minutes on average). I am now trying to implement the same NLP in Julia using the Optimization + Optim + ForwardDiff packages, with IPNewton as the solver, since the problem is constrained and my constraints are non-symbolic, user-defined, and nonlinear.

My question is: is Julia right for me? My implementation seems correct syntactically and semantically, but the optimizer runs forever. To give a sense of the problem size: the input vector X is 1800 x 1, and the nonlinear equality plus inequality constraint vector is 1686 x 1. The program often runs out of memory while computing the Hessian tensor, or, if it doesn't, it simply never finishes. Here is a snippet of the optimization code:
function optimizer(X0, params, lb, ub)
    # objective with ForwardDiff-based derivatives and user-defined nonlinear constraints
    objective = OptimizationFunction(cost, Optimization.AutoForwardDiff(1), cons = constraints)
    # objective = OptimizationFunction(cost, Optimization.AutoFiniteDiff(), cons = constraints)

    collocation_params = create_common_parameters_for_collocation()
    merge!(params, collocation_params)
    lcons, ucons = get_constraint_bounds(params)

    problem = OptimizationProblem(objective, X0, params, lb = lb, ub = ub,
                                  lcons = lcons, ucons = ucons, sense = Optimization.MinSense)

    @time sol = solve(problem, IPNewton(), maxiters = 10, show_trace = true,
                      extended_trace = true, show_every = 1, maxtime = 60)
    return sol
end
Note that the solver either does not finish even one iteration, or it is simply ignoring the trace options I specified.
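My suspicion is that the dense second-order information is simply too big to form. A quick back-of-the-envelope estimate, assuming everything is stored as dense Float64:

n = 1800                               # decision variables
m = 1686                               # constraints
objective_hessian_bytes  = n^2 * 8     # ≈ 26 MB, fine
constraint_hessian_bytes = m * n^2 * 8 # ≈ 44 GB if the full constraint Hessian tensor is dense

That would explain the out-of-memory behaviour, though I may be misreading how Optimization.jl actually stores the constraint Hessians.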
I would greatly appreciate any direction from seasoned Julia engineers on this. I had high hopes of reducing the average runtime of my NLP in Julia, so this result is completely counter-intuitive to me.
Thanks