I am trying to use multistart optimization with L-BFGS via MultistartOptimization.jl. However, the code demo on the docs page (MultistartOptimization.jl · Optimization.jl) yields the following error:
(base) xx@node-r5-he01:~/xx/examples$ "$HOME/bin/julia-home" --compiled-modules=no
(Julia startup banner: Version 1.11.5 (2025-04-14), official https://julialang.org/ release)
pkg> activate ./
Activating new project at `/xx/examples`
(examples) pkg> add Optimization OptimizationMultistartOptimization OptimizationNLopt
Resolving package versions...
Updating `/xx/examples/Project.toml`
[7f7a1694] + Optimization v5.0.0
[e4316d97] + OptimizationMultistartOptimization v0.3.2
[4e6fcdb7] + OptimizationNLopt v0.3.5
Updating `/xx/examples/Manifest.toml`
(… ~75 dependency entries elided; the relevant ones are:)
[3933049c] + MultistartOptimization v0.3.1
[76087f3c] + NLopt v1.2.1
[bca83a33] + OptimizationBase v3.2.0
[0bca4576] + SciMLBase v2.121.1
⌅ [aea7be01] + PrecompileTools v1.2.1
[079eb43e] + NLopt_jll v2.10.0+0
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m`
julia> using Optimization, OptimizationMultistartOptimization, OptimizationNLopt
julia> rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
rosenbrock (generic function with 1 method)
julia> x0 = zeros(2)
2-element Vector{Float64}:
0.0
0.0
julia> p = [1.0, 100.0]
2-element Vector{Float64}:
1.0
100.0
julia> f = OptimizationFunction(rosenbrock)
OptimizationFunction{true, SciMLBase.NoAD, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}(rosenbrock, SciMLBase.NoAD(), nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, SciMLBase.DEFAULT_OBSERVED_NO_TIME, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing)
julia> prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
OptimizationProblem. In-place: true
u0: 2-element Vector{Float64}:
0.0
0.0
julia> sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())
ERROR: MethodError: objects of type Nothing are not callable
The object of type `Nothing` exists, but no method is defined for this combination of argument types when trying to treat it as a callable object.
Stacktrace:
[1] #8
@ ~/.julia/packages/OptimizationNLopt/J6Ylt/src/OptimizationNLopt.jl:181 [inlined]
[2] nlopt_callback_wrapper(n::UInt32, p_x::Ptr{Float64}, p_grad::Ptr{Float64}, d::NLopt.Callback_Data{OptimizationNLopt.var"#8#18"{OptimizationCache{…}, OptimizationNLopt.var"#7#17"{…}}, Opt})
@ NLopt ~/.julia/packages/NLopt/wMgqN/src/NLopt.jl:522
[3] nlopt_optimize
@ ~/.julia/packages/NLopt/wMgqN/src/libnlopt.jl:182 [inlined]
[4] optimize!(o::Opt, x::Vector{Float64})
@ NLopt ~/.julia/packages/NLopt/wMgqN/src/NLopt.jl:849
[5] optimize
@ ~/.julia/packages/NLopt/wMgqN/src/NLopt.jl:863 [inlined]
[6] __solve(cache::OptimizationCache{OptimizationFunction{…}, OptimizationBase.ReInitCache{…}, Vector{…}, Vector{…}, Nothing, Nothing, Nothing, Algorithm, Bool, OptimizationNLopt.var"#2#4", Nothing})
@ OptimizationNLopt ~/.julia/packages/OptimizationNLopt/J6Ylt/src/OptimizationNLopt.jl:263
[7] solve!
@ ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:205 [inlined]
[8] #solve#3
@ ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:96 [inlined]
[9] solve(::OptimizationProblem{true, OptimizationFunction{…}, Vector{…}, Vector{…}, Vector{…}, Vector{…}, Nothing, Nothing, Nothing, Nothing, @Kwargs{}}, ::Algorithm)
@ OptimizationBase ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:93
[10] (::OptimizationMultistartOptimization.var"#3#5"{…})(pb::MinimizationProblem{…}, θ0::Vector{…}, prob::OptimizationProblem{…})
@ OptimizationMultistartOptimization ~/.julia/packages/OptimizationMultistartOptimization/9IWKk/src/OptimizationMultistartOptimization.jl:64
[11] (::OptimizationMultistartOptimization.var"#local_optimiser#6"{…})(pb::MinimizationProblem{…}, θ0::Vector{…})
@ OptimizationMultistartOptimization ~/.julia/packages/OptimizationMultistartOptimization/9IWKk/src/OptimizationMultistartOptimization.jl:68
[12] local_minimization
@ ~/.julia/packages/MultistartOptimization/7t00L/src/generic_api.jl:57 [inlined]
[13] (::MultistartOptimization.var"#_step#14"{…})(visited_minimum::@NamedTuple{…}, ::Tuple{…})
@ MultistartOptimization ~/.julia/packages/MultistartOptimization/7t00L/src/tiktak.jl:128
[14] BottomRF
@ ./reduce.jl:86 [inlined]
[15] _foldl_impl(op::Base.BottomRF{MultistartOptimization.var"#_step#14"{…}}, init::@NamedTuple{location::Vector{…}, value::Float64}, itr::Base.Iterators.Enumerate{Base.Iterators.Drop{…}})
@ Base ./reduce.jl:58
[16] foldl_impl
@ ./reduce.jl:48 [inlined]
[17] mapfoldl_impl(f::typeof(identity), op::MultistartOptimization.var"#_step#14"{…}, nt::@NamedTuple{…}, itr::Base.Iterators.Enumerate{…})
@ Base ./reduce.jl:44
[18] mapfoldl(f::Function, op::Function, itr::Base.Iterators.Enumerate{Base.Iterators.Drop{Vector{Any}}}; init::@NamedTuple{location::Vector{Float64}, value::Float64})
@ Base ./reduce.jl:175
[19] mapfoldl
@ ./reduce.jl:175 [inlined]
[20] #foldl#336
@ ./reduce.jl:198 [inlined]
[21] multistart_minimization(multistart_method::TikTak, local_method::Function, minimization_problem::MinimizationProblem{…}; use_threads::Bool, prepend_points::Vector{…})
@ MultistartOptimization ~/.julia/packages/MultistartOptimization/7t00L/src/tiktak.jl:132
[22] __solve(cache::OptimizationCache{OptimizationFunction{…}, OptimizationBase.ReInitCache{…}, Vector{…}, Vector{…}, Nothing, Nothing, Nothing, TikTak, Bool, OptimizationBase.NullCallback, Nothing})
@ OptimizationMultistartOptimization ~/.julia/packages/OptimizationMultistartOptimization/9IWKk/src/OptimizationMultistartOptimization.jl:71
[23] solve!
@ ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:205 [inlined]
[24] #solve#3
@ ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:96 [inlined]
[25] solve(prob::OptimizationProblem{true, OptimizationFunction{…}, Vector{…}, Vector{…}, Vector{…}, Vector{…}, Nothing, Nothing, Nothing, Nothing, @Kwargs{}}, alg::TikTak, args::Algorithm)
@ OptimizationBase ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:93
[26] top-level scope
@ REPL[9]:1
[27] eval
@ ./boot.jl:430 [inlined]
[28] eval_user_input(ast::Any, backend::REPL.REPLBackend, mod::Module)
@ REPL ~/julia-1.11.5-home/share/julia/stdlib/v1.11/REPL/src/REPL.jl:261
[29] repl_backend_loop(backend::REPL.REPLBackend, get_module::Function)
@ REPL ~/julia-1.11.5-home/share/julia/stdlib/v1.11/REPL/src/REPL.jl:368
[30] start_repl_backend(backend::REPL.REPLBackend, consumer::Any; get_module::Function)
@ REPL ~/julia-1.11.5-home/share/julia/stdlib/v1.11/REPL/src/REPL.jl:343
[31] run_repl(repl::REPL.AbstractREPL, consumer::Any; backend_on_current_task::Bool, backend::Any)
@ REPL ~/julia-1.11.5-home/share/julia/stdlib/v1.11/REPL/src/REPL.jl:500
[32] run_repl(repl::REPL.AbstractREPL, consumer::Any)
@ REPL ~/julia-1.11.5-home/share/julia/stdlib/v1.11/REPL/src/REPL.jl:486
[33] (::Base.var"#1150#1152"{Bool, Symbol, Bool})(REPL::Module)
@ Base ./client.jl:446
[34] #invokelatest#2
@ ./essentials.jl:1055 [inlined]
[35] invokelatest
@ ./essentials.jl:1052 [inlined]
[36] run_main_repl(interactive::Bool, quiet::Bool, banner::Symbol, history_file::Bool, color_set::Bool)
@ Base ./client.jl:430
[37] repl_main
@ ./client.jl:567 [inlined]
[38] _start()
@ Base ./client.jl:541
Stacktrace:
[1] optimize!(o::Opt, x::Vector{Float64})
@ NLopt ~/.julia/packages/NLopt/wMgqN/src/NLopt.jl:856
[2] optimize
@ ~/.julia/packages/NLopt/wMgqN/src/NLopt.jl:863 [inlined]
[3] __solve(cache::OptimizationCache{OptimizationFunction{…}, OptimizationBase.ReInitCache{…}, Vector{…}, Vector{…}, Nothing, Nothing, Nothing, Algorithm, Bool, OptimizationNLopt.var"#2#4", Nothing})
@ OptimizationNLopt ~/.julia/packages/OptimizationNLopt/J6Ylt/src/OptimizationNLopt.jl:263
[4] solve!
@ ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:205 [inlined]
[5] #solve#3
@ ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:96 [inlined]
[6] solve(::OptimizationProblem{true, OptimizationFunction{…}, Vector{…}, Vector{…}, Vector{…}, Vector{…}, Nothing, Nothing, Nothing, Nothing, @Kwargs{}}, ::Algorithm)
@ OptimizationBase ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:93
[7] (::OptimizationMultistartOptimization.var"#3#5"{…})(pb::MinimizationProblem{…}, θ0::Vector{…}, prob::OptimizationProblem{…})
@ OptimizationMultistartOptimization ~/.julia/packages/OptimizationMultistartOptimization/9IWKk/src/OptimizationMultistartOptimization.jl:64
[8] (::OptimizationMultistartOptimization.var"#local_optimiser#6"{…})(pb::MinimizationProblem{…}, θ0::Vector{…})
@ OptimizationMultistartOptimization ~/.julia/packages/OptimizationMultistartOptimization/9IWKk/src/OptimizationMultistartOptimization.jl:68
[9] local_minimization
@ ~/.julia/packages/MultistartOptimization/7t00L/src/generic_api.jl:57 [inlined]
[10] (::MultistartOptimization.var"#_step#14"{…})(visited_minimum::@NamedTuple{…}, ::Tuple{…})
@ MultistartOptimization ~/.julia/packages/MultistartOptimization/7t00L/src/tiktak.jl:128
[11] BottomRF
@ ./reduce.jl:86 [inlined]
[12] _foldl_impl(op::Base.BottomRF{MultistartOptimization.var"#_step#14"{…}}, init::@NamedTuple{location::Vector{…}, value::Float64}, itr::Base.Iterators.Enumerate{Base.Iterators.Drop{…}})
@ Base ./reduce.jl:58
[13] foldl_impl
@ ./reduce.jl:48 [inlined]
[14] mapfoldl_impl(f::typeof(identity), op::MultistartOptimization.var"#_step#14"{…}, nt::@NamedTuple{…}, itr::Base.Iterators.Enumerate{…})
@ Base ./reduce.jl:44
[15] mapfoldl(f::Function, op::Function, itr::Base.Iterators.Enumerate{Base.Iterators.Drop{Vector{Any}}}; init::@NamedTuple{location::Vector{Float64}, value::Float64})
@ Base ./reduce.jl:175
[16] mapfoldl
@ ./reduce.jl:175 [inlined]
[17] #foldl#336
@ ./reduce.jl:198 [inlined]
[18] multistart_minimization(multistart_method::TikTak, local_method::Function, minimization_problem::MinimizationProblem{…}; use_threads::Bool, prepend_points::Vector{…})
@ MultistartOptimization ~/.julia/packages/MultistartOptimization/7t00L/src/tiktak.jl:132
[19] __solve(cache::OptimizationCache{OptimizationFunction{…}, OptimizationBase.ReInitCache{…}, Vector{…}, Vector{…}, Nothing, Nothing, Nothing, TikTak, Bool, OptimizationBase.NullCallback, Nothing})
@ OptimizationMultistartOptimization ~/.julia/packages/OptimizationMultistartOptimization/9IWKk/src/OptimizationMultistartOptimization.jl:71
[20] solve!
@ ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:205 [inlined]
[21] #solve#3
@ ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:96 [inlined]
[22] solve(prob::OptimizationProblem{true, OptimizationFunction{…}, Vector{…}, Vector{…}, Vector{…}, Vector{…}, Nothing, Nothing, Nothing, Nothing, @Kwargs{}}, alg::TikTak, args::Algorithm)
@ OptimizationBase ~/.julia/packages/OptimizationBase/uENXp/src/solve.jl:93
[23] top-level scope
@ REPL[9]:1
Some type information was truncated. Use `show(err)` to see complete types.
Why does this happen, and how can I fix it? That is, how do I use L-BFGS or other gradient-based methods (with min and max bounds set on the optimization problem) together with multistart?
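My guess is that `NLopt.LD_LBFGS()` needs a gradient, but `OptimizationFunction(rosenbrock)` was constructed without any AD backend, so the gradient callback ends up being `nothing` and the `MethodError: objects of type Nothing are not callable` fires inside the NLopt callback wrapper. Is the fix simply to declare an AD backend when building the `OptimizationFunction`? An untested sketch of what I mean (assuming `Optimization.AutoForwardDiff()` is the right way to request ForwardDiff-based gradients here; it may additionally require `using ForwardDiff` in the environment):

```julia
using Optimization, OptimizationMultistartOptimization, OptimizationNLopt

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Declare an AD backend so gradient-based local solvers (LD_* in NLopt)
# receive a gradient instead of `nothing`.
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())

prob = OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())
```

If that is the intended usage, it would be good to know whether the docs demo is simply missing the AD argument, or whether something changed between Optimization.jl versions.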