Hello,
I am using the following two lines, and OptimizationOptimJL is installed:
using OptimizationOptimJL;
optimized_sol_nn2 = Optimization.solve(optprob, optimized_sol_nn.minimizer, Optim.BFGS(initial_stepnorm=0.001), maxiters = 1000, allow_f_increases = true)
#optimized_sol_nn2 = DiffEqFlux.sciml_train(p -> cost_adjoint_nn(p, 0.08), optimized_sol_nn.minimizer, BFGS(initial_stepnorm=0.001), maxiters = 1000, allow_f_increases = true)
However, I am getting the following error:
Optimization algorithm not found. Either the chosen algorithm is not a valid solver choice for the OptimizationProblem, or the Optimization solver library is not loaded. Make sure that you have loaded an appropriate Optimization.jl solver library, for example, solve(prob, Optim.BFGS()) requires `using OptimizationOptimJL` and solve(prob, Adam()) requires `using OptimizationOptimisers`. For more information, see the Optimization.jl documentation: https://docs.sciml.ai/Optimization/stable/. Chosen Optimizer:
I checked the documentation, and I am using the correct package. However, I do not understand why I am getting this error for Optim.BFGS(initial_stepnorm=0.001).
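For context, one possible reading of the error (I may be misreading the API) is that Optimization.solve expects the algorithm as its second positional argument, so the extra minimizer argument would be interpreted as the "Chosen Optimizer". A minimal sketch of the call under that assumption, using remake to restart the problem from optimized_sol_nn.minimizer rather than passing it to solve:

```julia
# Sketch only: assumes the Optimization.jl API where solve(prob, alg; kwargs...)
# takes the algorithm second, and the starting point lives on the problem.
using Optimization, OptimizationOptimJL

# Rebuild the problem with the previous minimizer as the new initial guess
# (remake is re-exported from SciMLBase; u0 here is the assumed field name).
optprob2 = remake(optprob; u0 = optimized_sol_nn.minimizer)

optimized_sol_nn2 = Optimization.solve(optprob2,
                                       Optim.BFGS(initial_stepnorm = 0.001);
                                       maxiters = 1000,
                                       allow_f_increases = true)
```

This is only how I understand the documented signature, not a confirmed fix.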