The article you link compares Clenshaw-Curtis to GQ, while I was comparing GQ to the naive approach of "simply evaluate the function a hundred or so times along an interval, and then fit a high-order polynomial via regression". The article you link is super-interesting, but I still think that GQ dominates the suggested solution (even though Clenshaw-Curtis may be better).
The article just says the errors are comparable. Since FastGaussQuadrature.jl is O(n) while Clenshaw-Curtis is O(n log n), which one is better depends on n.
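To make the GQ side of the comparison concrete, here is a pure-Python sketch of Gauss-Legendre quadrature (not FastGaussQuadrature.jl, which computes nodes for arbitrary n in O(n)): a hard-coded 3-point rule, which is exact for polynomials up to degree 2·3 − 1 = 5. The function name and the affine-map helper are my own illustrative choices.

```python
import math

def gauss_legendre_3(f, a, b):
    """Integrate f over [a, b] with the classical 3-point
    Gauss-Legendre rule (exact for polynomials up to degree 5)."""
    # Standard nodes/weights on the reference interval [-1, 1]
    nodes = (-math.sqrt(3 / 5), 0.0, math.sqrt(3 / 5))
    weights = (5 / 9, 8 / 9, 5 / 9)
    # Affine map from [-1, 1] to [a, b]
    mid, half = (a + b) / 2, (b - a) / 2
    return half * sum(w * f(mid + half * t) for w, t in zip(weights, nodes))
```

With only three function evaluations this reproduces, e.g., the quintic integral ∫₀¹ x⁵ dx = 1/6 to machine precision, which is the property the naive "fit a regression polynomial to ~100 samples" approach pays much more for.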
(ApproxFun is essentially doing adaptive Clenshaw-Curtis when you use it for integrals. Note that once you have a Fun, the transform step has already been done, so integration goes down to O(n).)
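To illustrate why integration is O(n) once the transform is done: the expensive step is computing the Chebyshev coefficients from function samples; after that, the integral over [−1, 1] is a single dot product, since ∫ Tₖ dx = 2/(1 − k²) for even k and 0 for odd k. This is a pure-Python sketch, not ApproxFun's implementation — the naive transform below is O(n²), where ApproxFun uses an FFT-based one in O(n log n).

```python
import math

def cheb_coeffs(f, n):
    """Chebyshev coefficients of f from samples at the n Chebyshev
    points x_j = cos(pi*j/(n-1)) -- naive O(n^2) DCT for illustration."""
    fs = [f(math.cos(math.pi * j / (n - 1))) for j in range(n)]
    c = []
    for k in range(n):
        # Trapezoid-style sum with endpoint terms halved (DCT-I)
        s = 0.5 * (fs[0] + fs[-1] * math.cos(math.pi * k))
        s += sum(fs[j] * math.cos(math.pi * k * j / (n - 1))
                 for j in range(1, n - 1))
        c.append(2 * s / (n - 1))
    c[0] /= 2
    c[-1] /= 2
    return c

def cheb_integrate(c):
    """O(n) integral over [-1, 1] given Chebyshev coefficients:
    the odd-k basis functions integrate to zero."""
    return sum(ck * 2 / (1 - k * k) for k, ck in enumerate(c) if k % 2 == 0)
```

For example, `cheb_integrate(cheb_coeffs(lambda x: x * x, 5))` recovers ∫₋₁¹ x² dx = 2/3, and for smooth functions like exp the coefficients decay fast enough that modest n already gives machine-precision integrals.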
Thank you very much! Will try to read.
I realize I’m a little late to the conversation, but I just thought I’d add that I was aware that the @clenshaw macro in ApproxFun grew the intermediate expressions exponentially (before the compiler untangles them and produces nice O(n) machine code). I posted in the old julia-users group to seek help, as it was just a personal experiment with meta-programming.
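For readers who haven't seen it, the underlying algorithm is Clenshaw's recurrence for evaluating a Chebyshev series Σ cₖ Tₖ(x) in O(n) without forming the Tₖ explicitly: bₖ = cₖ + 2x·bₖ₊₁ − bₖ₋₂ run backwards, then f(x) = c₀ + x·b₁ − b₂. A plain-loop sketch (in Python, for illustration — the macro in question generates the unrolled Julia equivalent of this loop):

```python
def clenshaw(c, x):
    """Evaluate the Chebyshev series sum_k c[k] * T_k(x) by
    Clenshaw's recurrence, running the coefficients backwards."""
    b1 = b2 = 0.0
    for ck in reversed(c[1:]):
        b1, b2 = ck + 2 * x * b1 - b2, b1
    return c[0] + x * b1 - b2
```

As a sanity check, `clenshaw([0.0, 0.0, 1.0], x)` reproduces T₂(x) = 2x² − 1, and `clenshaw([1.0, 2.0, 3.0], 0.5)` matches the direct expansion 1 + 2x + 3(2x² − 1) = 0.5.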
If a good solution is available, a PR in ApproxFun would be appreciated!
Sure — I believe the code snippet I posted above works and is faster than a simple function. I’ll test a bit more and make a PR.
EDIT: here it is.