I want to use the LBFGS() method in the Optim package for optimization. LBFGS usually performs many line searches, but I only want it to do one iteration. How can I do that?
Just set the iterations option to 1, for example:
result = optimize(rosenbrock, zeros(2), LBFGS(), Optim.Options(iterations=1))
Hmm, this is not what I mean; maybe I didn't go into enough detail.
Here is a minimal working example:
using Optim
function f(x)
    temp = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
    println("f=", temp)
    return temp
end
result = optimize(f, [1.0 1.0], LBFGS(), Optim.Options(show_trace=true,extended_trace=true,iterations=1))
The output is:
f=1.4704168796019891e-8
f=1.47039911610062e-8
f=3.666852862404853e-9
f=3.6668528625393112e-9
f=0.0
Iter Function value Gradient norm
0 0.000000e+00 1.466736e-08
* Current step size: 1.0
* time: 0.0
* g(x): [1.4667356107373247e-8 -1.1102239280930583e-14]
* x: [1.0 1.0]
f=1.4633022753087148e-8
f=1.4775308449140644e-8
f=3.702466146048789e-9
f=3.6314121143532064e-9
f=8.626772924165988e-14
f=1.470407997780277e-8
f=1.4704079977879413e-8
f=3.666897160566489e-9
f=3.6668085646459185e-9
f=1.3412189577348004e-19
f=1.4704124387146717e-8
f=1.4704035569409278e-8
f=3.6668750114521403e-9
f=3.666830713559084e-9
f=3.353047394337001e-20
f=1.4704146591574923e-8
f=1.4704013365333815e-8
f=3.666863936920114e-9
f=3.666841788040815e-9
f=8.382618485842503e-21
f=1.4704157694334489e-8
f=1.4704002263167913e-8
f=3.666858399660388e-9
f=3.666847325287967e-9
f=2.0956546214606257e-21
f=1.4704163245715843e-8
f=1.4703996711816946e-8
f=3.6668556308976383e-9
f=3.6668500940475728e-9
f=5.238627689649373e-22
f=1.470416602032856e-8
f=1.470399393668103e-8
f=3.666854246785573e-9
f=3.666851478158853e-9
f=1.3099113605918222e-22
f=1.4704167408174191e-8
f=1.4703992548843586e-8
f=3.66685355459518e-9
f=3.666852170349049e-9
f=3.2747784014795554e-23
f=1.4704168102636214e-8
f=1.4703991854924885e-8
f=3.6668532085000085e-9
f=3.666852516444172e-9
f=8.186946003698888e-24
f=1.4704168449867228e-8
f=1.4703991507695954e-8
f=3.6668530353179703e-9
f=3.6668526896261973e-9
f=2.0435571048983057e-24
f=1.4704168622404382e-8
f=1.4703991334620665e-8
f=3.666852948995869e-9
f=3.6668527759482952e-9
f=5.124795920761108e-25
f=1.4704168709751314e-8
f=1.4703991247813434e-8
f=3.6668529057003606e-9
f=3.666852819243803e-9
f=1.281198980190277e-25
f=1.4704168752885602e-8
f=1.4703991204140229e-8
f=3.666852883918149e-9
f=3.6668528410260152e-9
f=3.1633322299362573e-26
f=1.4704168774452748e-8
f=1.4703991182573215e-8
f=3.6668528731615006e-9
f=3.666852851782663e-9
f=7.908330574840643e-27
f=1.470416878523632e-8
f=1.4703991171789708e-8
f=3.666852867783177e-9
f=3.6668528571609866e-9
f=1.977082643710161e-27
f=1.4704168790628105e-8
f=1.4703991166397954e-8
f=3.6668528650940147e-9
f=3.666852859850149e-9
f=4.942706609275402e-28
f=1.4704168793863177e-8
f=1.4703991163702078e-8
f=3.666852863749434e-9
f=3.6668528611947296e-9
f=1.2356766523188505e-28
f=1.4704168794941536e-8
f=1.4703991162623728e-8
f=3.6668528632116014e-9
f=3.6668528617325624e-9
f=4.448435948347862e-29
f=1.4704168796019891e-8
f=1.4703991161545379e-8
f=3.6668528626737686e-9
f=3.666852862270395e-9
f=4.942706609275402e-30
f=1.4704168796019891e-8
f=1.47039911610062e-8
f=3.666852862404853e-9
f=3.6668528625393112e-9
f=0.0
(... the same block of f= values repeats many more times ...)
1 0.000000e+00 1.466736e-08
* Current step size: 3.784673312960098e-9
* time: 1.1170001029968262
* g(x): [1.4667356107373247e-8 -1.1102239280930583e-14]
* x: [1.0 1.0]
* Status: success
* Candidate solution
Final objective value: 0.000000e+00
* Found with
Algorithm: L-BFGS
* Convergence measures
|x - x'| = 0.00e+00 ≤ 0.0e+00
|x - x'|/|x'| = 0.00e+00 ≤ 0.0e+00
|f(x) - f(x')| = 0.00e+00 ≤ 0.0e+00
|f(x) - f(x')|/|f(x')| = NaN ≰ 0.0e+00
|g(x)| = 1.47e-08 ≰ 1.0e-08
* Work counters
Seconds run: 1 (vs limit Inf)
Iterations: 1
f(x) calls: 49
∇f(x) calls: 49
From the results above, we can see that the number of optimization iterations is 1, but many line-search iterations are performed.
When my function is very expensive, these multiple line searches call it many times without a significant improvement in the optimization. So what I actually want to control is the number of line-search iterations, and the output I want looks like this:
Iter Function value Gradient norm
0 0.000000e+00 1.466736e-08
* Current step size: 1.0
* time: 0.0
* g(x): [1.4667356107373247e-8 -1.1102239280930583e-14]
* x: [1.0 1.0]
f=1.4633022753087148e-8
1 0.000000e+00 1.466736e-08
* Current step size: 3.784673312960098e-9
* time: 1.1170001029968262
* g(x): [1.4667356107373247e-8 -1.1102239280930583e-14]
* x: [1.0 1.0]
Line search routines live in a separate package, LineSearches.jl; LBFGS takes one as a keyword argument:
linesearch = LineSearches.BackTracking(iterations=10)
LBFGS(; linesearch=linesearch)
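Putting the two together with the thread's objective, a minimal sketch might look like this (the BackTracking iterations keyword follows LineSearches.jl; the specific caps are just for illustration):

```julia
using Optim, LineSearches

# Rosenbrock-style objective from the thread
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Cap the inner backtracking line search at 10 iterations
# and the outer L-BFGS loop at a single iteration.
ls = LineSearches.BackTracking(iterations=10)
result = optimize(f, [0.0, 0.0], LBFGS(linesearch=ls),
                  Optim.Options(iterations=1))

# BackTracking generally needs far fewer function evaluations
# per step than the default HagerZhang line search.
println(Optim.f_calls(result))
```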
Yes.
So you can do as @IlyaOrson says above and set it to 1. Or use LineSearches.Static(). I don't think you want to, but you can.
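For reference, a sketch of the Static() variant: Static() accepts the proposed step as-is, so the line search itself spends no extra function evaluations (assuming the same objective as above):

```julia
using Optim, LineSearches

f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Static() takes the full proposed step without searching,
# so f is only evaluated as needed for the step itself.
result = optimize(f, [0.0, 0.0], LBFGS(linesearch=LineSearches.Static()),
                  Optim.Options(iterations=1))
```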
This part I don't understand. If you set it to 1, you will probably get a "Current step size" of 1 or 1/2, depending on what you do.
Great, thank you for your reply. It's very helpful.