Did Joseph Fourier ever make a pure mathematical mistake?

Anyone working on anything connected with calculus back then could hardly avoid making mistakes, since there was simply no logically coherent formulation of the basic definitions at the time. Trying to prove something about continuous functions without a definition of continuity is going to lead to problems.

Fourier in particular is famous for stating that any periodic function is equal to the sum of its Fourier series. This is nonsense (see the comment below), but it's one of the all-time great errors. Trying to make sense of it, to see what could actually be proved in this direction, was one motivation for the development of modern rigorous analysis. Sorting it out was part of the impetus for at least three major developments that spring to mind:

  • Cauchy, Weierstrass, and others invent epsilons and deltas. Now we can actually state and prove things about calculus rigorously.

  • But the theory of Fourier series, although it now made sense logically, still didn't work as well as we'd like; Lebesgue and others invent the Lebesgue integral and the theory of Fourier series gets a big boost.

  • Cantor was actually led to set theory, in particular transfinite numbers, in the course of investigations into Fourier series! (When you're studying sets of uniqueness for trig series the notion of the "derived set" $E'$ of $E$ comes up; this is the set of limit points of $E$. Then one can consider $E''$, etc; this leads naturally to a study of $E^\alpha$ for infinite ordinals $\alpha$.)

(The first two items above are hugely well known. For more on the third, regarding Cantor, set theory and Fourier series, you might look here or here. Will R suggests you look here; I haven't seen it (my internet is too slow for YouTube), but a lecture by Walter Rudin on the topic is certain to be great.)
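To make the derivation process above concrete, here is a standard toy example (mine, not one of Cantor's sets of uniqueness): take $$E=\Bigl\{\tfrac1m+\tfrac1n : m,n\in\Bbb N\Bigr\}.$$ The limit points are obtained by fixing one index and letting the other tend to infinity (or letting both tend to infinity), so $$E'=\Bigl\{\tfrac1n : n\in\Bbb N\Bigr\}\cup\{0\},\qquad E''=\{0\},\qquad E'''=\emptyset.$$ Each derivation can strictly shrink the set, and for more complicated sets the process need not terminate in finitely many steps, which is what forces the transfinite ordinals $E^\alpha$ into the picture.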


Comment: I had no idea that the assertion that there exists a (continuous) function with a divergent Fourier series would be controversial. Writing down an explicit example is not easy; every continuous function that Fourier ever encountered does have a convergent Fourier series.

But proving the existence is very simple, from the right point of view. Say $s_n(f)$ is the $n$-th partial sum of the Fourier series for $f$ and $D_n$ is the Dirichlet kernel, so that $$s_n(f)(0)=\frac1{2\pi}\int_0^{2\pi}f(t)D_n(t)\,dt.$$The norm of $f\mapsto s_n(f)(0)$ as a linear functional on $C(\Bbb T)$ is the same as the norm of $D_n$ regarded as a complex measure, which is in turn equal to $\|D_n\|_1$. It's easy to see that $\|D_n\|_1\ge c\log n$. So the Uniform Boundedness Principle, aka the Banach-Steinhaus Theorem, shows that there exists $f\in C(\Bbb T)$ such that $s_n(f)(0)$ is unbounded.
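For the curious, the growth of $\|D_n\|_1$ is easy to check numerically. Here is a quick sketch (not part of the argument above) using the closed form $D_n(t)=\sin\bigl((n+\tfrac12)t\bigr)/\sin(t/2)$ and a plain Riemann average for the normalized $L^1$ norm:

```python
import numpy as np

def lebesgue_constant(n, samples=400001):
    """Estimate L_n = (1/2pi) * integral_0^{2pi} |D_n(t)| dt,
    where D_n(t) = sin((n + 1/2) t) / sin(t / 2) is the Dirichlet kernel."""
    # Drop the two endpoints, where sin(t/2) = 0 (D_n extends continuously there).
    t = np.linspace(0.0, 2.0 * np.pi, samples + 2)[1:-1]
    Dn = np.sin((n + 0.5) * t) / np.sin(t / 2.0)
    # The normalized integral (1/2pi) * int |D_n| is just the average of |D_n|.
    return float(np.mean(np.abs(Dn)))

for n in (10, 100, 1000):
    Ln = lebesgue_constant(n)
    print(f"n = {n:5d}   L_n = {Ln:6.3f}   L_n / log n = {Ln / np.log(n):.3f}")
```

The printed ratios $L_n/\log n$ settle toward a constant (the classical asymptotic is $L_n\sim\frac{4}{\pi^2}\log n$), consistent with the $\|D_n\|_1\ge c\log n$ bound used above.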


Presumably the reference is to:

  • Imre Lakatos, Proofs and Refutations: The Logic of Mathematical Discovery (1977), Appendix 1.1, "Cauchy's Defence of the 'Principle of Continuity'", pp. 127 ff.,

referring to Fourier's example of a convergent series of continuous functions whose sum is discontinuous (a counterexample to Cauchy's "principle of continuity"), found in Fourier's Mémoire sur la Propagation de la Chaleur (1812).

But we are not speaking of "mistakes" in the sense of calculation errors and the like; what Lakatos discusses are "exceptions" to a general theorem, cases where the proof neglects some condition necessary for its validity.


Fourier was right about his conjecture concerning expanding functions in a Fourier series, if you consider what "functions" meant at the time. It's fun to pounce on Fourier, and I notice a lot of people have drunk that Kool-Aid, but the facts don't support what people typically claim. Set theory didn't exist, and general functions had not been conceived. The general functions of Fourier's time were piecewise arcs, and Fourier did demonstrate convergence to the mean of the left and right limits for such functions. It is false that Dirichlet gave the first proof of this fact. In fact, Dirichlet's proof was almost identical to the one given by Fourier, and it was Fourier who introduced the "Dirichlet kernel." It is very possible that Dirichlet took his proof from Fourier's manuscript that had been denied publication. It is true that Fourier also gave several wrong demonstrations, but the "Dirichlet kernel" should really be called the Fourier kernel.
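The convergence-to-the-mean phenomenon is easy to see numerically. A modern sketch (the standard square-wave series, not Fourier's own computation): for $f(t)=\operatorname{sign}(t)$ on $(-\pi,\pi)$, the partial sums are $s_K(t)=\frac4\pi\sum_{k=0}^{K}\frac{\sin((2k+1)t)}{2k+1}$, and at the jump $t=0$ every partial sum equals $0$, the average of the one-sided limits $-1$ and $+1$:

```python
import numpy as np

def square_wave_partial_sum(t, K):
    """Fourier partial sum s_K(t) of the square wave f(t) = sign(t) on (-pi, pi):
    s_K(t) = (4/pi) * sum_{k=0}^{K} sin((2k+1) t) / (2k+1)."""
    k = np.arange(K + 1)
    return (4.0 / np.pi) * float(np.sum(np.sin((2 * k + 1) * t) / (2 * k + 1)))

# At the jump t = 0, every term is sin(0) = 0, so s_K(0) = 0 exactly:
# the mean of the one-sided limits f(0-) = -1 and f(0+) = +1.
print(square_wave_partial_sum(0.0, 1000))
# Away from the jump the partial sums converge to the function value.
print(square_wave_partial_sum(0.5, 5000))   # close to f(0.5) = 1
```
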

Imagine doing what Fourier did at a time when the following had not yet been defined:

  • the Riemann integral

  • the real numbers

  • set theory and general functions

  • completion of a space and convergence of Cauchy sequences

  • functional analysis

  • inner product spaces

  • the Cauchy-Schwarz inequality

It's important to keep historical perspective, and to keep in mind that a large part of analysis came out of trying to resolve Fourier's claims.

Quoting from the well-regarded 1926 Introduction to the Theory of Fourier's Series and Integrals by H. S. Carslaw,

Debunking False Claims