Can I reconstruct a network using Flux.params in Flux.jl?

I think the recommended way to save a model is with BSON (Saving & Loading · Flux), which takes one line of code and works well. It saves the entire neural network, structure included. The parameter object alone doesn't usually have all the information needed to restore a network, such as the activation functions or the layer layout.
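As a minimal sketch of the BSON route (the model and filename here are illustrative, not from the original post; `@save`/`@load` come from BSON.jl):

```julia
using Flux
using BSON: @save, @load

# A small example model (hypothetical)
model = Chain(Dense(2 => 3, relu), Dense(3 => 1))

@save "mymodel.bson" model   # one line saves the whole model
@load "mymodel.bson" model   # restores it, activations and all
```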

If you really want to do it via parameters, you can use Flux.destructure:

```julia
parameters, re = Flux.destructure(model)
model_1 = re(parameters)
```

Here `parameters` is a flat vector of all parameter values, and `re` is a function that rebuilds a model with the original structure from such a vector.
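A quick end-to-end sketch of the destructure/rebuild round trip (the `Chain` here is an assumed toy model, not one from the question):

```julia
using Flux

# Toy model for illustration (hypothetical)
model = Chain(Dense(2 => 3, relu), Dense(3 => 1))

θ, re = Flux.destructure(model)   # θ: flat Vector of all weights and biases
model_1 = re(θ)                   # rebuild an identical model from θ

x = rand(Float32, 2)
model(x) ≈ model_1(x)             # same parameters, same structure, same output
```

Note that `re` closes over the model's structure, so this only works if you keep `re` around (or can recreate the same architecture) alongside the parameter vector.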

Overall, I'd recommend BSON for saving and reloading a model.