Has there been any update or progress on this? I recently ran into a similar problem, which I posted in Nested and different AD methods altogether: How to add AD calculations inside my loss function when using neural differential equations?, and which I am still trying to get working. Thanks!
facusapienza