I am trying to train an MLP network with a single output, where the outputs fall into groups (as indicated by a stratification variable). I wish to normalise the outputs within each group (as in softmax) before applying the loss function.
I had written a custom loss function to do this before Flux switched from Tracker to Zygote. My network ran just fine previously, but now fails with a ‘Need an adjoint for constructor’ error.
Can anyone help me figure out how to specify the loss function?
In my code, I pass the stratification variable along with the target output, so the composite target matrix y has 2 rows and n columns:

- Data matrix x has m rows and n columns
- Target matrix y has 2 rows and n columns
- m(x) is my model (a ‘Chain’ with a single exponential output activation)
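For concreteness, the setup described above might look like the following sketch (layer sizes, group labels, and the hidden activation are my assumptions, not from the original code):

```julia
using Flux

# Hypothetical reconstruction of the setup: m features per sample,
# n samples, one exponential output.
nfeat, nhidden, n = 5, 16, 100
x = rand(Float32, nfeat, n)            # data matrix: m rows, n columns
y = vcat(rand(Float32, 1, n),          # row 1: target values
         rand(1:3, 1, n))              # row 2: stratification variable
model = Chain(Dense(nfeat, nhidden, relu),
              Dense(nhidden, 1),
              z -> exp.(z))            # single exponential output activation
size(model(x))                         # (1, n)
```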
Here is my loss function:

p = renstrat(y[2,:], m(x)) # renstrat() is a fast external normalisation routine
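For what it's worth, the ‘Need an adjoint for constructor’ error usually means Zygote hit something it cannot trace, most often array mutation or a call into external code. Since I don't know what renstrat() does internally, here is a hypothetical mutation-free stand-in that normalises outputs within groups purely by indexing and broadcasting, which Zygote can differentiate:

```julia
# groupnorm: hypothetical pure replacement for renstrat().
# strata: length-n vector of group labels (row 2 of the target matrix)
# yhat:   length-n vector of (already exponentiated) model outputs
function groupnorm(strata, yhat)
    # For each sample i, divide by the sum of yhat over i's group.
    # O(n^2), but allocation-only (no in-place writes), so Zygote-friendly.
    denom = map(i -> sum(yhat[strata .== strata[i]]), eachindex(yhat))
    return yhat ./ denom
end

# Loss using the normalised outputs (squared-error form is my assumption):
loss(x, y) = sum(abs2, groupnorm(y[2, :], vec(m(x))) .- y[1, :])
```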
This is a very basic example, which worked fine with the Tracker version of Flux but not with the Zygote one.
I even tried manually replacing the loss function with its 2-term Taylor-series expansion (i.e., a simple quadratic function!!), but got the same error.
Alternatively, I could supply the derivatives manually, if I knew how to specify them.
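From what I understand, Zygote does let you supply derivatives by hand via Zygote.@adjoint, which would also sidestep the error if the external routine itself is what Zygote chokes on. A minimal sketch with a stand-in function (myfun is hypothetical, not my real renstrat):

```julia
using Zygote

# Stand-in for a routine Zygote cannot differentiate on its own.
myfun(v) = v .^ 2

# Supply the reverse-mode rule by hand: primal value, plus a pullback
# mapping the output cotangent Δ to the input cotangent.
Zygote.@adjoint myfun(v) = myfun(v), Δ -> (2 .* v .* Δ,)

g = Zygote.gradient(v -> sum(myfun(v)), [1.0, 2.0])[1]  # → [2.0, 4.0]
```

With a rule like this in place, Zygote never looks inside the wrapped function, so it doesn't matter that the body is opaque or mutating.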
Does anyone have any ideas?
Thanks for any help.
On another point: it seems that many people have had issues writing custom loss functions with the ‘Zygote’ version of Flux. Has anyone ever heard of someone getting one to work?
Custom loss functions are essential for research, and even just for the advancement of basic applications.
Is there any chance of bringing back Tracker, at least as an option (until Zygote becomes fully operational in this regard)?