I'm not currently at a computer, but as far as I remember, the output is not restricted to [0, 1] by default.
You can take a look at the model architecture directly in the REPL by doing
julia> cpumodel
Check the last lines to see whether the final transformation restricts the output to [0, 1] (sigmoid/hardsigmoid) or not. My guess is that it's just a relu, possibly followed by a batchnorm, at the end.
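If you want to double-check empirically, you can also push a dummy input through the model and look at the range of the outputs. A minimal sketch, assuming cpumodel is a Flux Chain and the input shape below is just a placeholder you'd adjust to your data:

julia> x = rand(Float32, 28, 28, 1, 1)   # hypothetical input; match your model's expected shape
julia> extrema(cpumodel(x))              # min > 0 and max > 1 would suggest an unbounded relu output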
If the model indeed doesn't end in one, you can simply append a sigmoid/hardsigmoid at the end to get what you want.
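For example, assuming cpumodel is a Flux Chain, a minimal sketch would be:

julia> using Flux
julia> bounded = Chain(cpumodel, x -> sigmoid.(x))      # broadcast sigmoid over the outputs to squash them into (0, 1)
julia> # or: Chain(cpumodel, x -> hardsigmoid.(x))      # cheaper piecewise-linear alternative

Keep in mind that if you bolt this on after training, the network wasn't trained with that nonlinearity, so you may want to fine-tune (or retrain) with the sigmoid in place.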