I'm getting an error when saving and loading a `sciml_train` result with BSON:
```julia
res1_node = DiffEqFlux.sciml_train(loss, p, ADAM(0.01), cb = callback, maxiters = 500)

using BSON: @save
@save "res1_node" res1_node
```
When I try to load it back:
```julia
using BSON: @load
@load "res1_node" res1_node
```
I get this error:
```
ERROR: UndefVarError: Optimization not defined
Stacktrace:
  [1] (::BSON.var"#31#32")(m::Module, f::String)
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\extensions.jl:21
  [2] BottomRF
    @ .\reduce.jl:81 [inlined]
  [3] _foldl_impl(op::Base.BottomRF{BSON.var"#31#32"}, init::Module, itr::Vector{Any})
    @ Base .\reduce.jl:58
  [4] foldl_impl
    @ .\reduce.jl:48 [inlined]
  [5] mapfoldl_impl
    @ .\reduce.jl:44 [inlined]
  [6] _mapreduce_dim
    @ .\reducedim.jl:327 [inlined]
  [7] #mapreduce#731
    @ .\reducedim.jl:322 [inlined]
  [8] #reduce#733
    @ .\reducedim.jl:371 [inlined]
  [9] resolve(fs::Vector{Any}, init::Module)
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\extensions.jl:21
 [10] (::BSON.var"#35#36")(d::Dict{Symbol, Any}, init::Module)
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\extensions.jl:64
 [11] _raise_recursive(d::Dict{Symbol, Any}, cache::IdDict{Any, Any}, init::Module)
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\read.jl:80
 [12] raise_recursive(d::Dict{Symbol, Any}, cache::IdDict{Any, Any}, init::Module)
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\read.jl:93
 [13] (::BSON.var"#23#24"{IdDict{Any, Any}, Module})(x::Dict{Symbol, Any})
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\read.jl:98
 [14] applychildren!(f::BSON.var"#23#24"{IdDict{Any, Any}, Module}, x::Vector{Any})
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\BSON.jl:26
 [15] raise_recursive
    @ C:\Users\44745\.julia\packages\BSON\rOaki\src\read.jl:98 [inlined]
 [16] (::BSON.var"#17#20"{IdDict{Any, Any}, Module})(x::Vector{Any})
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\read.jl:80
 [17] applychildren!(f::BSON.var"#17#20"{IdDict{Any, Any}, Module}, x::Dict{Symbol, Any})
    @ BSON C:\Users\44745\.julia\packages\BSON\rOaki\src\BSON.jl:19
```
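From the `UndefVarError`, my guess is that BSON needs the `Optimization` module to be in scope when it reconstructs the saved object. I haven't been able to verify this — is loading the defining packages before `@load` supposed to work, something like:

```julia
# Untested idea: load the packages that define the saved result's types
# before deserializing (assuming Optimization is pulled in by DiffEqFlux)
using DiffEqFlux, Optimization
using BSON: @load
@load "res1_node" res1_node
```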
I also tried explicitly defining and saving the optimizer:
```julia
opt1 = ADAM(0.01)
@save "res1_nodeopt" res1_node opt1
@load "res1_nodeopt" res1_node opt1
```
but I get the same error.
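If there is no direct fix, would saving only the trained parameter vector be the recommended approach? A sketch of what I mean (assuming the trained parameters live in the result's `u` field — I'm not sure that's the right accessor):

```julia
using BSON: @save, @load

# Assumption: the optimized parameters are stored in res1_node.u;
# a plain numeric vector should not need any package types to round-trip.
p_trained = res1_node.u
@save "p_trained.bson" p_trained
@load "p_trained.bson" p_trained
```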
Does anyone know how to fix this?
Thanks