Saving/loading Flux models with Julia 1.8.x?

I have been using BSON.jl for saving and loading Flux models, but with Julia 1.8 there is an issue with BSON that appears to be a sticking point (see `setfield!: const field .names of type TypeName cannot be changed` on Julia 1.8, JuliaIO/BSON.jl issue #109). Any new recommendations for saving and loading?

There was a discussion about this on Slack. Some people suggested:

- LegolasFlux.jl (beacon-biosignals): save Flux model weights in Legolas-powered Arrow tables
- LightBSON.jl (ancapdev): high-performance encoding and decoding of BSON data in Julia

Kyle Daruwala suggested this:

I would also recommend serializing the state via `fmapstructure` instead of the model itself. The types need to be resolvable in the current namespace for BSON to work anyway.

And of course there is the Serialization standard library, which should now be backward compatible.

These are copy-pasted from Slack answers. I have not tried them myself; I usually use Serialization, since my models are short-lived.
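For reference, the state-serialization idea can be illustrated without Flux at all. The key point is that a nested NamedTuple of plain arrays (roughly what `fmapstructure` would produce from a model) round-trips through `Serialization` in any session, because it contains no user-defined or anonymous types. A minimal stdlib-only sketch; the layer names here are made up for illustration:

```julia
using Serialization

# A stand-in for model state: a nested NamedTuple of plain arrays,
# similar in shape to what Functors.fmapstructure(identity, model)
# would return for a real Flux model.
state = (layer1 = (weight = rand(3, 3), bias = zeros(3)),
         layer2 = (weight = rand(2, 3), bias = zeros(2)))

# Plain data round-trips cleanly through Serialization, because
# NamedTuple and Array are defined in every Julia session.
serialize("state.jls", state)
restored = deserialize("state.jls")

@assert restored.layer1.weight == state.layer1.weight
@assert restored.layer2.bias == state.layer2.bias
```

With a real Flux model one would extract such a state tree (e.g. via `Functors.fmapstructure`), serialize that, and copy the arrays back into a freshly constructed model on load.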


Thanks for the links. `serialize`/`deserialize` doesn’t seem to work for me: I can use it to save the model, but I run into problems when trying to load it back:

julia> using BSON, Flux, CUDA, Serialization

julia> BSON.@load "models/MA2/mcx2_(n-100).bson" best_model

julia> serialize("junk", best_model)

julia> 
❯ julia --proj --quiet
julia> using BSON, Flux, CUDA, Serialization

julia> deserialize("junk")
ERROR: UndefVarError: #233#234 not defined
Stacktrace:

The problem seems to be in the part of the model at the end of this excerpt:

Chain(
  Chain(
    Chain(
      Parallel(
        +,
        Chain(
          Conv((1, 32), 1 => 32, pad=(0, 0, 16, 15)),  # 1_056 parameters
          BatchNorm(32, relu),          # 64 parameters, plus 64
          Conv((1, 32), 32 => 32, pad=(0, 0, 16, 15)),  # 32_800 parameters
          BatchNorm(32, relu),          # 64 parameters, plus 64
        ),
        Conv((1, 1), 1 => 32),          # 64 parameters
      ),
      BSON.__deserialized_types__.var"#233#234"(),


Do you have some anonymous function at the end of the model? In that Slack channel they have written that BSON is broken with 1.8.

What is `BSON.__deserialized_types__.var"#233#234"()` in the model definition?

I think that must come from originally saving the model with BSON. In the future I will try eliminating BSON entirely, including for the original save, and try serialization again. Thanks for the mention of the Slack discussion, I found it.
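On the anonymous-function point: a closure gets a gensym'd type name like `var"#233#234"`, which only exists in the session that created it, so a fresh session cannot resolve it on load. A top-level named function, by contrast, is resolved by name. A small stdlib-only sketch of the workaround (the `squeeze3` name is made up for illustration, not part of the model above):

```julia
using Serialization

# Named at top level: deserialization can look this up by name,
# unlike an anonymous `x -> dropdims(x; dims = 3)`, whose gensym'd
# type (e.g. var"#233#234") does not exist in a fresh session.
squeeze3(x) = dropdims(x; dims = 3)

serialize("layer.jls", squeeze3)
f = deserialize("layer.jls")

@assert f(ones(2, 2, 1)) == ones(2, 2)
```

So one fix is to rebuild the model with named top-level functions in place of anonymous ones before saving, whether BSON or Serialization is used.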