Can you autodiff through it? A similar idea to what you're doing now would be to compare ForwardDiff.derivative(f, x + eps(x)) and ForwardDiff.derivative(f, x - eps(x)), or something like that.
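A minimal sketch of that comparison (assuming f is ForwardDiff-compatible; the rtol cutoff is an arbitrary choice):

```julia
using ForwardDiff

# Compare the one-sided slopes just below and just above x; a mismatch
# suggests a kink. (Note this tests differentiability, not continuity.)
function slopes_match(f, x; rtol = 1e-6)
    d⁻ = ForwardDiff.derivative(f, x - eps(x))
    d⁺ = ForwardDiff.derivative(f, x + eps(x))
    return isapprox(d⁻, d⁺; rtol)
end

slopes_match(abs, 1.0)    # true: slope is 1 on both sides
slopes_match(abs, 0.0)    # false: slopes -1 and +1
```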
You need a whole lot of frequencies to approximate a discontinuity, so it wouldn't be a peak. Explaining it as a sudden change in frequency, whatever that means, seems questionable at best. Which frequency would suddenly change to which if you have f(x) = sign(x)?
Fair points, that's why I said I'm not an expert on FFT. You're right that a sudden change of slope is not a change of frequency. I guess it is more like an impulse, which contains all frequencies. Thanks for the clarification.
I guess I was thinking along the lines of those artifacts (peaks) that occur when you expand a square wave in a Fourier series (the Gibbs phenomenon). But I guess the derivative is a better approach: the derivative of the Heaviside step function can be understood as the Dirac delta function at the step.
EDIT: I was just trying to brainstorm, but I guess it was not that helpful…
Also, in order to calculate an FFT (DFT) you'd need to sample the function's outputs at a number of points in the domain around the particular value. A discontinuity anywhere in that sampled region, not just at the desired point, would probably be indistinguishable.
Note that you wouldn't use a DFT or Fourier series directly, because that implies periodic boundaries (so a generic non-periodic function will effectively always have "discontinuities" at the endpoints from the perspective of the DFT). If you have a function on an arbitrary finite interval x \in [a,b] and you want to detect discontinuities and other singularities automatically (e.g. at unknown positions), you could expand in Chebyshev polynomials (e.g. via ApproxFun.jl or FastChebInterp.jl; these use FFTs internally because they are related to a DFT by a change of variables), and look at the convergence rate of the Chebyshev coefficients c_n: if it is converging only as \sim 1/n asymptotically, that indicates a discontinuity somewhere in the interior (a,b) of the domain. (More generally, slower-than-exponential asymptotic convergence indicates some non-analyticity.)
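For example, a rough sketch of the coefficient-decay test with FastChebInterp.jl (the chebpoints/chebinterp calls and the coefs field follow my reading of its README; treat the exact API as an assumption):

```julia
using FastChebInterp

a, b = -1.0, 2.0
f(x) = abs(x - 0.3)            # kink (discontinuous slope) inside (a, b)

order = 200                    # deliberately high, fixed interpolation order
x = chebpoints(order, a, b)    # Chebyshev points on [a, b]
c = chebinterp(f.(x), a, b)    # Chebyshev interpolant; c.coefs holds the cₙ

# For analytic f, |cₙ| decays exponentially; algebraic decay ~ 1/n^k signals
# a singularity (1/n for a jump, 1/n² for a kink, ...). Eyeball the tail,
# e.g. on a log-log plot:
@show abs.(c.coefs[end-4:end])
```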
If you want to locate unknown discontinuities/singularities in an interval [a,b] (if any), there are various recursive search algorithms, e.g. the algorithm used by Chebfun with the "splitting on" option, which is described in the article "Piecewise smooth chebfuns" (2010).
Their findjump code (free/open-source under the 3-clause BSD license) is pretty short and wouldn't be too hard to re-implement in Julia.
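Not their algorithm, but a crude bisection sketch in a similar spirit, assuming a single jump dominates the variation of f on the bracket:

```julia
# Repeatedly keep the half of [a, b] with the larger variation of f, until
# the bracket shrinks to (nearly) adjacent floats. Illustration only; the
# real findjump is smarter about noise and multiple edges.
function locate_jump(f, a, b)
    fa, fb = f(a), f(b)
    while nextfloat(a) < b
        m = (a + b) / 2
        (m == a || m == b) && break       # guard against rounding stalls
        fm = f(m)
        if abs(fm - fa) >= abs(fb - fm)
            b, fb = m, fm                 # bigger variation in the left half
        else
            a, fa = m, fm
        end
    end
    return a, b                           # tiny bracket around the jump
end

locate_jump(x -> sign(x - 0.3), 0.0, 1.0) # bracket around x ≈ 0.3
```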
Oh geez, you're right. I read that too quickly and thought Tamas was asking for a test of differentiability (where the point was to compare the two one-sided derivatives), not continuity. Sorry, guys.
I thought about this but could not figure out a reasonable cutoff. E.g., consider
```julia
function f(x; A)
    z = x - 1
    return z > 0 ? A * z : 2 * A * z
end
```
which is definitely continuous at x = 1 (where z = 0), but the difference between f(1 \pm \epsilon) will depend on A, which can be arbitrarily large.
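To make that concrete (the A values below are arbitrary), the gap across the kink scales linearly with A even though f is continuous there:

```julia
f(x; A) = (z = x - 1; z > 0 ? A * z : 2 * A * z)   # the function above

for A in (1.0, 1e8, 1e16)
    gap = abs(f(nextfloat(1.0); A) - f(prevfloat(1.0); A))
    println("A = $A:  gap = $gap")    # grows like ≈ 4.4e-16 * A
end
```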
I should have emphasized that the locations of the possible discontinuities are known. No need to search for them.
Thanks for all the answers. I will probably go with Richardson extrapolation as it is fairly straightforward, and the cost does not matter (again, this is for CI).
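For reference, a sketch of that check with Richardson.jl, whose extrapolate(f, h; x0) estimates the limit of f(x0 + h) as h → 0 and returns a (value, error estimate) pair; the rtol is an arbitrary cutoff:

```julia
using Richardson

function continuous_at(f, p; rtol = 1e-8)
    right, _ = extrapolate(f, 0.1; x0 = p)     # one-sided limit from above
    left, _  = extrapolate(f, -0.1; x0 = p)    # one-sided limit from below
    return isapprox(left, f(p); rtol) && isapprox(right, f(p); rtol)
end

continuous_at(x -> abs(x - 1), 1.0)            # true
continuous_at(x -> x > 1 ? x + 1 : x, 1.0)     # false: jump of 1 at x = 1
```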
The way to go is more symbolic. As in AD, running the function on a "dual-like" variable type would tape the operations and collect the points of potential discontinuity along with their left/right/middle values.
TL;DR: copy the AD logic and apply it to this problem; but prevfloat/nextfloat is a good approximation.
If there is a piecewise-linear function implementation in some package, it could also serve as a guide (I'm not familiar with one).
It isn't trivial, and I'm not aware of a paper. The idea would be to extend a Value into a (Value, Dir) pair, where Dir is a "direction": either +, -, or 0. Most calculations would pass through and operate on Value, except comparisons like <, <=, ==, >, which would virtually add an infinitesimal in the direction Dir. Thus (3, +) <= 3 would be false.
With this setup, testing the continuity of f at 3 would amount to checking whether f(3+) == f(3) == f(3-). Some other discontinuous functions would also need special definitions, such as round, sign, etc.
ADDED:
This "etc." above does a lot of work. For example, -(3, +) would be (-3, -).
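A minimal sketch of this (Value, Dir) idea; every name here is hypothetical, and only enough operations are defined for a piecewise-linear example:

```julia
struct Directed
    val::Float64
    dir::Int    # -1, 0, or +1: an infinitesimal offset below/at/above val
end

# Arithmetic carries the direction along; multiplying by a negative number
# flips it, as in -(3, +) == (-3, -):
Base.:-(a::Directed) = Directed(-a.val, -a.dir)
Base.:-(a::Directed, b::Real) = Directed(a.val - b, a.dir)
Base.:+(a::Directed, b::Real) = Directed(a.val + b, a.dir)
Base.:*(c::Real, a::Directed) =
    c > 0 ? Directed(c * a.val, a.dir) :
    c < 0 ? Directed(c * a.val, -a.dir) : Directed(0.0, 0)

# Comparisons treat dir as an infinitesimal tie-breaker, so (3, +) <= 3 is false:
Base.:>(a::Directed, b::Real) = a.val > b || (a.val == b && a.dir > 0)

# The piecewise-linear function from earlier in the thread:
f(x; A) = (z = x - 1; z > 0 ? A * z : 2 * A * z)
g(x) = f(x; A = 1e16)

g(Directed(1.0, +1)).val == g(1.0) == g(Directed(1.0, -1)).val  # true: continuous at 1
```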
I would wager that interval arithmetic (IEEE 1788) is the correct tool for this job. It defines the DAC decoration, which certifies that a function is "defined and continuous" on an interval (which can be restricted to just a single real number).
Also, without decorations one could easily imagine a numerical epsilon-delta test for continuity: as you shrink your input interval around a suspect point of your function, the result interval should also shrink at the appropriate rate.
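A sketch with IntervalArithmetic.jl (interval, decoration, and diam are from recent versions of that package; older versions spell the decorated type DecoratedInterval, so treat the exact API as an assumption):

```julia
using IntervalArithmetic

# Decorations record what the arithmetic could guarantee along the way;
# com/dac both certify "defined and continuous" on the input interval.
decoration(sqrt(interval(0, 2)))    # dac: continuous, but 0 is on the domain boundary
decoration(sqrt(interval(-1, 2)))   # trv: sqrt is not defined on all of [-1, 2]

# The epsilon-delta idea without decorations: shrink the input interval
# around a suspect point and watch whether the output width shrinks too.
f(x) = abs(x - 1)
for r in (1e-2, 1e-4, 1e-8)
    println("radius $r  =>  output width ", diam(f(interval(1 - r, 1 + r))))
end
```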
Is your setting that your function is stitched together from different definitions at certain nodes, like e.g. a spline? And you now want a quick plausibility check that everything matches up?
For a "mathematical / extensional" function defined on floating-point numbers, i.e. a black box, there is not a lot more to do than comparing f(prevfloat(p)), f(p), and f(nextfloat(p)) up to some cutoff (which may depend on f(p)). What cutoff is appropriate depends on your domain.
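That check is a one-liner; rtol/atol stand in for the domain-dependent cutoff (a nonzero atol matters when f(p) ≈ 0, since a pure relative tolerance can never match against an exact zero):

```julia
# Black-box continuity heuristic at p: compare f at the two neighboring floats.
looks_continuous(f, p; rtol = 1e-8, atol = 1e-12) =
    isapprox(f(prevfloat(p)), f(p); rtol, atol) &&
    isapprox(f(nextfloat(p)), f(p); rtol, atol)

looks_continuous(abs, 0.0)    # true
looks_continuous(sign, 0.0)   # false: values -1, 0, 1 around the jump
```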
For a "mathematical / extensional" function defined on real numbers, I have a halting-problem oracle to sell you. Jokes aside, that doesn't work.
For an AD-able / interval-math-able implementation of a function, there might be some clever definitions and tests. These tests may depend on the implementation of the function, not just its input-output behavior, the same as with all these things (you can implement f(x) = x as a lookup table on Float32, but that won't AD).
AD is exact in exact arithmetic (i.e. in the absence of roundoff error). Richardson extrapolation (with a finite number of iterations) is not. So, they seem qualitatively different to me.
Richardson is black-box: you apply it to a function, i.e. a relation.
AD applies to something like a circuit that computes a function, and outputs a circuit that computes / approximates the derivative. The same holds for interval math.