Hello,
I would like to define an adjoint for `Base.TwicePrecision`, but I am wondering whether that is even possible, given the following test:
```julia
julia> using Zygote

julia> gradient(x -> Float64(sum(map(i -> i*Base.TwicePrecision(x), 1:10))), 1.0)
ERROR: Non-differentiable function Core.Intrinsics.bitcast
```
Is it possible to work around a non-differentiable function?
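To make the question concrete, here is the kind of workaround I have in mind: custom `Zygote.@adjoint` definitions that treat `Base.TwicePrecision` as an opaque scalar, so Zygote never traces into the bit-level code where the `bitcast` lives. This is only an untested sketch, and it assumes the cotangent of a `TwicePrecision` can be represented as a plain `Float64`:

```julia
using Zygote

# Sketch (untested): represent the cotangent of a TwicePrecision as a
# plain Float64 so Zygote never looks inside the bit-level code.

# Constructing a TwicePrecision from a Float64: pass the cotangent through.
Zygote.@adjoint Base.TwicePrecision(x::Float64) =
    Base.TwicePrecision(x), Δ -> (Float64(Δ),)

# Converting back to Float64: identity-like pullback.
Zygote.@adjoint Float64(x::Base.TwicePrecision) =
    Float64(x), Δ -> (Δ,)

# Multiplication by an ordinary number (I believe this is where
# truncbits, and hence the bitcast, gets hit): differentiate it as an
# ordinary scalar product.
Zygote.@adjoint Base.:*(x::Base.TwicePrecision, v::Number) =
    x * v, Δ -> (Δ * v, Δ * Float64(x))

# Addition, so that sum over TwicePrecision values also works.
Zygote.@adjoint Base.:+(x::Base.TwicePrecision, y::Base.TwicePrecision) =
    x + y, Δ -> (Δ, Δ)
```

Presumably every other `TwicePrecision` operation that Zygote traces into would need the same treatment, which is why I am asking whether this approach can work at all.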
This question came up while trying to differentiate a `StepRangeLen`: even after defining adjoints for `StepRangeLen` and `Base.TwicePrecision`, I ran into the same error.
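For reference, the original failing case was along these lines (illustrative, not my exact code). A `Float64` range built this way is a `StepRangeLen` whose `ref` and `step` fields are `TwicePrecision` values, so the gradient runs into the same `bitcast`:

```julia
using Zygote

# A Float64 range is a StepRangeLen backed by TwicePrecision ref/step
# fields, so differentiating through it hits the same
# Core.Intrinsics.bitcast error.
gradient(x -> sum(range(0.0, stop=x, length=10)), 1.0)
```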
Thanks