It is worth taking this post as an opportunity to link to the related new question: Nesting the GLM.jl in the objective function when formulating the ReverseDiff gradient generates StackOverflowError
That question is a better description of the end goal, as @mcreel suggested. In particular, it also contains a clue to how the weights are generated (yes, a clue, because readers need to undo the transformations applied to protect the ‘proprietary’ application).
The weights in that question are all scalar multiples of each other, which means `predict_data` in that question is independent of the `para[3]` scaling factor and depends only on `data[:,:w]`. It also means inference on `para[3]` is impossible: it is statistically unidentifiable.
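To see why a common scalar factor in the weights is unidentifiable, here is a small NumPy sketch (not the original GLM.jl code; all names here are illustrative) showing that scaling every weight by the same constant leaves the weighted least-squares fit, and hence the predictions, unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 2
X = rng.normal(size=(n, p))
y = X @ np.array([1.5, -0.7]) + rng.normal(size=n)
w = rng.uniform(0.5, 2.0, size=n)  # base weights (stand-in for data[:,:w])

def wls(X, y, w):
    # Weighted least squares: beta = (X' W X)^{-1} X' W y
    XtW = X.T * w  # broadcast: scales column j of X.T by w[j]
    return np.linalg.solve(XtW @ X, XtW @ y)

beta1 = wls(X, y, w)
beta2 = wls(X, y, 3.7 * w)  # multiply all weights by one scalar (the para[3] role)

# The scalar cancels in (X'WX)^{-1} X'Wy, so the fit is identical
print(np.allclose(beta1, beta2))  # → True
```

Because the scalar cancels algebraically, no amount of data can recover it: any value of the scaling factor yields exactly the same likelihood, which is the unidentifiability described above.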