Numerical Differentiation. What is the best method?

If your function is badly behaved (e.g. noisy, very oscillatory), no method will perform well, since differentiation is numerically very unstable. That being said, for "nice" functions, I have good experience with polynomial (Richardson) extrapolation methods. This paper and this paper give hints on how you might write your own implementation. I will note that this is the method implemented in the NAG numerical libraries (with, of course, a few wrinkles of their own).
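For concreteness, here is a minimal sketch of the idea (not the NAG routine); the initial step `h0` and table depth `n` are illustrative choices you would tune:

```julia
# Richardson extrapolation on central differences: a minimal sketch.
# h0 (initial step) and n (table depth) are illustrative, not tuned.
function richardson_derivative(f, x; h0=0.5, n=6)
    D = zeros(n, n)
    for i in 1:n
        h = h0 / 2^(i - 1)
        D[i, 1] = (f(x + h) - f(x - h)) / (2h)   # central difference, O(h^2)
        for j in 2:i
            # Each column cancels the next even power of h in the error.
            D[i, j] = D[i, j-1] + (D[i, j-1] - D[i-1, j-1]) / (4^(j-1) - 1)
        end
    end
    return D[n, n]
end

richardson_derivative(sin, 1.0)   # ≈ cos(1) ≈ 0.5403023058681398
```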

There are two possible alternatives if for some reason you don't want to use Richardsonian methods. One is to use Cauchy's differentiation formula:

$$f^\prime(x)=\frac1{2\pi i}\oint_\gamma \frac{f(t)}{(t-x)^2}\mathrm dt$$

where $f$ must be analytic in a neighborhood of $x$ and it is up to you to choose a suitable counterclockwise contour $\gamma$ (a circle is customary); the other is to use the "Lanczos derivative":

$$f^\prime(x)=\lim_{h\to 0}\frac{3}{2h^3}\int_{-h}^h t\;f(x+t)\mathrm dt$$

where you will either have to experiment with an appropriate step size $h$ or use some extrapolative procedure.
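Both are straightforward to prototype. Here is a hedged sketch of each; the radius `r`, node counts `m`, and step `h` below are illustrative choices, and the Cauchy approach requires `f` to accept complex arguments:

```julia
# Cauchy's formula via the trapezoidal rule on a circle of radius r
# centered at x; for f analytic on the disk this converges rapidly.
function cauchy_derivative(f, x; r=0.1, m=32)
    s = zero(complex(float(x)))
    for k in 0:m-1
        t = x + r * cis(2π * k / m)        # node on the contour
        dt = im * (t - x) * (2π / m)       # γ'(θ) Δθ
        s += f(t) / (t - x)^2 * dt
    end
    return real(s / (2π * im))
end

# Lanczos derivative with a fixed small h and a composite trapezoid rule;
# in practice you would tune h or wrap this in an extrapolation loop.
function lanczos_derivative(f, x; h=1e-2, m=100)
    ts = range(-h, h, length=m + 1)
    g(t) = t * f(x + t)
    integral = step(ts) * (sum(g, ts) - (g(-h) + g(h)) / 2)
    return 3 * integral / (2h^3)
end

cauchy_derivative(sin, 1.0), lanczos_derivative(sin, 1.0)   # both ≈ cos(1)
```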

You will have to experiment in your computing environment to decide which suits your needs.


Use automatic differentiation. Automatic differentiation is faster than other forms of differentiation and achieves errors on the order of machine epsilon. However, it's much harder to implement, so you might want to use a package for it. That said, since it's so useful, there are plenty of packages; one I use very often is ForwardDiff.jl.
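For example, with ForwardDiff.jl (the function below is just an arbitrary test case):

```julia
using ForwardDiff

f(x) = sin(x^2) + exp(x)
ForwardDiff.derivative(f, 1.0)          # matches 2cos(1) + e to machine precision
# analytic check: 2 * 1.0 * cos(1.0^2) + exp(1.0)
```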

It's a shame that more academics don't use automatic differentiation, because it really should be the workhorse, yet it's just not taught in graduate school for some reason (though those who go into machine learning know reverse-mode automatic differentiation as "backpropagation").


Edit

One can automatically differentiate library functions in Julia because the libraries themselves are written in Julia and are "duck-typed", so ForwardDiff.jl can differentiate through them (including ones that use loops, conditionals, etc.) thanks to its type system. The only problem arises when C/Fortran libraries are called, so you just have to avoid library functions that internally call BLAS (i.e., no matrix multiplications and you're good).
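As a small illustration of the duck-typing point (a made-up function with a loop and a branch):

```julia
using ForwardDiff

# ForwardDiff's Dual numbers flow straight through loops and branches,
# as long as the code is written generically (note zero(x), not 0.0).
function g(x)
    s = zero(x)
    for k in 1:10
        s += x > 0 ? x^k / k : -x^k
    end
    return s
end

ForwardDiff.derivative(g, 0.5)
```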

Edit 2

ForwardDiff will now use Julia's generic linear-algebra fallbacks to compute the derivatives, so those will be fine as well. The only Base library case I can think of where automatic differentiation will still fail (in Julia) is differentiating a function that computes an FFT, since that calls FFTW and doesn't have a Julia fallback (for now).
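A quick check of that (assuming a recent ForwardDiff.jl; the matrix and function are arbitrary):

```julia
using ForwardDiff, LinearAlgebra

# A * x with Dual-number inputs dispatches to Julia's generic matmul
# rather than BLAS, so the gradient goes through cleanly.
A = [2.0 1.0; 1.0 3.0]
q(x) = norm(A * x)^2                     # = xᵀAᵀAx
ForwardDiff.gradient(q, [1.0, 2.0])      # compare: 2 * A' * (A * [1.0, 2.0])
```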


In addition to J.M.'s answer: this paper, http://www.sciencedirect.com/science/article/pii/S0021904512000123 ("Differentiation by integration using orthogonal polynomials, a survey" by E. Diekema and T.H. Koornwinder), provides further insight into the Lanczos derivative and its generalizations to $n$-th order derivatives. By the way, it notes that Lanczos's (1956) derivative formula goes back to Cioranescu (1938) and Haslam-Jones (1953).