"gradcheck" in Flux?

Is there a function to check a Zygote gradient against a finite-difference approximation?

I think I saw something in the discussions here, but I cannot find it, nor anything in the docs.
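For concreteness, this is the kind of check I mean (a hand-rolled sketch; `fdcheck` is a hypothetical name I made up, not an existing Flux or Zygote API):

```julia
using Zygote

# Compare Zygote's gradient of f at x against a central finite
# difference, entry by entry. cbrt(eps) is a reasonable step size
# for central differences.
function fdcheck(f, x::AbstractArray; h = cbrt(eps(eltype(x))), rtol = 1e-5)
    g_ad = Zygote.gradient(f, x)[1]
    g_fd = similar(x)
    for i in eachindex(x)
        xp = copy(x); xp[i] += h
        xm = copy(x); xm[i] -= h
        g_fd[i] = (f(xp) - f(xm)) / (2h)
    end
    isapprox(g_ad, g_fd; rtol = rtol)
end

fdcheck(x -> sum(abs2, x), randn(3, 3))  # true: the gradient is 2x
```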

There was a recent discussion on Zulip that might be helpful: Checking adjoints

Some links from that discussion:

the gradtest function

A PR to use FiniteDifferences.jl
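For example, FiniteDifferences.jl can be checked against Zygote directly. A sketch, using only the exported `central_fdm` and `grad` functions:

```julia
using Zygote, FiniteDifferences

# central_fdm(5, 1) builds a 5th-order central method for the
# 1st derivative; grad applies it to every entry of x.
f(x) = sum(abs2, sin.(x))
x = randn(4)

g_zygote = Zygote.gradient(f, x)[1]
g_fdm    = FiniteDifferences.grad(central_fdm(5, 1), f, x)[1]

isapprox(g_zygote, g_fdm; rtol = 1e-7)  # should be true
```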

This is helpful…

… however, it is not clear to me how to use these functions to check a constructed model that is differentiated with respect to Flux.params() or one of the other two schemes mentioned in the Zygote docs. I am trying to figure out whether the idea can be adapted. Probably I am overlooking the obvious!

(I do not have a clear statement of the deeper problem yet, but the surface problem is that the ngradient function evaluates a function that accepts arrays as its arguments, whereas the typical workflow of writing a Flux model does not result in functions that take their parameter arrays as arguments; a sketch of one possible workaround follows below.)
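One way I can imagine bridging that gap (an assumption on my part, not something from the linked discussion) is Flux.destructure, which flattens a model's parameters into a single vector and returns a function that rebuilds the model, so the loss becomes an ordinary function of an array:

```julia
using Flux, Zygote, FiniteDifferences

# Sketch: turn an implicit-parameter Flux model into an explicit
# function of a parameter vector, then compare Zygote against
# FiniteDifferences.jl on that vector.
model = Dense(3, 2, tanh) |> f64   # Float64 for accurate differencing
x = randn(3, 5)
y = randn(2, 5)

θ, re = Flux.destructure(model)         # θ::Vector, re(θ) rebuilds the model
loss(θ) = Flux.Losses.mse(re(θ)(x), y)  # loss as a plain function of an array

g_zygote = Zygote.gradient(loss, θ)[1]
g_fdm    = FiniteDifferences.grad(central_fdm(5, 1), loss, θ)[1]

isapprox(g_zygote, g_fdm; rtol = 1e-7)
```

This sidesteps Flux.params() entirely, so it only checks the explicit-parameter path, but it does let the finite-difference checkers above see the model as a function of an array.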