TensorFlow.jl: set the learning rate as a Variable


I’d like to control the learning rate of the optimizer in TensorFlow, so that at different stages of training (where I fix different parts of the network) I can use different learning rates.

From here I understand that I should make the learning rate a Variable and pass it to the optimizer as an argument. However, I get a Julia error:

ERROR: LoadError: MethodError: Cannot `convert` an object of type TensorFlow.Tensor{Float64} to an object of type Float64

I think this happens because the variable is always a tensor (even one with shape []), while the wrapper expects a plain Float64 in place of the learning rate parameter.
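A minimal sketch of what I’m trying (the model here is a made-up one-weight example just to reproduce the error; the relevant part is passing a Variable as the learning rate):

```julia
using TensorFlow

sess = Session(Graph())

# Dummy one-weight model, only to have something to minimize.
x = placeholder(Float64)
w = Variable(1.0)
loss = (x * w - 2.0)^2

# Make the learning rate a Variable, so I can reassign it
# between training stages.
lr = Variable(0.01)

# This is where the MethodError is thrown: the optimizer wrapper
# apparently expects a Float64, but `lr` is a Tensor{Float64}
# (with shape []).
opt = train.GradientDescentOptimizer(lr)
minimize_op = train.minimize(opt, loss)
```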

Any thoughts on this?