Flux -- decompose layers + parameters?

A silly question… I used simple Flux networks 2 years ago, and I notice there have been some changes… Suppose I have a network named md (…model…):

  1. In the old system, I could pick out layer no. i by md.layers[i].
    → Am I right that this has been replaced by md[i]?
  2. In the old system, I could pick bias vector b and weight matrix W by md.b.data and md.W.data.
    → Am I right that now I do par = params(md), and then par[i] for i in 1:2:end gives the bias vectors b, while par[i] for i in 2:2:end gives the weight matrices W?
  3. In the old version of Flux, it was a little tricky to specify the data type of the bias vectors and weight matrices.
    → Is this simpler, now?

[I ask because I didn’t find documentation for `params`, etc. in the documentation… perhaps it would have been simpler if I had a PDF file or an ePub file available which I could search in. Anyway, I’m planning a Julia intro seminar for my colleagues, and want to show off Flux.]

Nope, you can still do md.layers[i] if md is a Chain. md[i] is definitely preferred though.
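A minimal sketch of both indexing styles, assuming a recent Flux release and a toy two-layer Chain (the layer sizes here are made up for illustration):

```julia
using Flux

# A small example model; md is a Chain of two Dense layers.
md = Chain(Dense(4, 3, relu), Dense(3, 2))

# Old style and new style point at the very same layer object:
md[1] === md.layers[1]   # true

# Indexing with a range gives back a sub-Chain:
sub = md[1:2]
```

Slicing with `md[1:2]` is handy when you want to run only part of a model, e.g. to inspect intermediate activations.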

Because Flux no longer uses Tracker, you can just reference a layer’s b and W fields directly (e.g. md[i].W), with no .data unwrapping. params is mostly useful as a way around listing every single parameter array in a model when calling gradient.
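A hedged sketch of both access patterns; note the field names depend on the Flux version (recent releases use `weight`/`bias` on Dense, older ones used `W`/`b`):

```julia
using Flux

md = Chain(Dense(2, 3), Dense(3, 1))

# Direct field access on a layer -- plain arrays, no .data needed:
W1 = md[1].weight
b1 = md[1].bias

# params collects every trainable array, so gradient can take
# the whole model at once instead of listing each array:
ps = Flux.params(md)
x, y = rand(Float32, 2, 5), rand(Float32, 1, 5)
gs = gradient(() -> Flux.mse(md(x), y), ps)
```

This is the implicit-parameter (`Params`) style; `gs[md[1].weight]` then holds the gradient for that array.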

Do you mean e.g. Float64 vs Float32? Flux now has helper functions that will convert a layer’s or an entire model’s weights between the two (f64 and f32, respectively).
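For example, something along these lines (assuming a recent Flux, where Dense exposes a `weight` field and parameters default to Float32):

```julia
using Flux

md = Chain(Dense(3, 2))         # weights are Float32 by default

md64 = f64(md)                  # whole model converted to Float64
eltype(md64[1].weight)          # Float64

md32 = f32(md64)                # and back to Float32
```

Both helpers work on a single layer just as well as on a full Chain.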


What are these helper functions? Are they discussed in the documentation, or only in some tutorials?

[Would they be something like f64(md) or convert(f64, md), or something?]

Not directly. GPU Support · Flux and Advanced Model Building · Flux talk about the underlying functionality, but I assume the helpers weren’t documented because a) they’re quite simple, and b) most people never bother to use them, since the default of Float32 works well enough. If you wouldn’t mind filing an issue (or, preferably, a PR) for the docs, we can add a note about both.