Adjoint for Base.TwicePrecision


I would like to define an adjoint for Base.TwicePrecision, but I am wondering whether it is even possible, given the following test:

julia> using Zygote

julia> gradient(x -> Float64(sum(map(i -> i*Base.TwicePrecision(x), 1:10))), 1.0)
ERROR: Non-differentiable function Core.Intrinsics.bitcast

Is it possible to work around a non-differentiable function?

This question came from trying to differentiate a StepRangeLen: after defining adjoints for StepRangeLen and Base.TwicePrecision, I ran into the same error.


If you’re fine with a little type piracy, then a temporary measure would be to mark the offending function as @non_differentiable (from ChainRulesCore). Longer term, I think it’s worth asking on the ChainRules.jl issue tracker whether they’d be willing to accept a rule for TwicePrecision.
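As a rough sketch of that temporary workaround (assuming ChainRulesCore is loaded alongside Zygote; the exact method signature to mark is an assumption and depends on where the bitcast call actually originates in your stack trace):

```julia
using Zygote, ChainRulesCore

# Type piracy: we own neither ChainRulesCore's rule table nor
# Base.TwicePrecision, so keep this local and temporary.
# Marking the constructor as non-differentiable makes Zygote treat the
# call as a constant instead of trying to differentiate through the
# bitcast in its internals.
# (Hypothetical target — adjust the signature to whichever method the
# error points at.)
ChainRulesCore.@non_differentiable Base.TwicePrecision{Float64}(::Any, ::Any)
```

Note that a @non_differentiable rule propagates a zero tangent through that call, so any gradient that flows only through TwicePrecision will come out as nothing; it silences the error rather than giving you the adjoint you ultimately want, which is why a proper ChainRules.jl rule is the better long-term fix.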