Reverse mathematics of Cousin's lemma
Sam Sanders here, one of the authors of the paper you mention. Thanks for the nice words. I will answer your questions based on my personal opinion.
You write:
[...] would like to know if it has impacted the reverse mathematics (RM) program, and what its significance is seen as. Could path integrals really require such powerful axioms?
First of all, I cannot speak for the RM community. What I can tell you is that we have gotten all sorts of comments, negative and positive, from lots of (senior) people. I believe it is also fair to say that our paper (and related results) shows that the coding of even basic analysis in second-order RM does not accurately represent mathematics. There are those who disagree, as one would expect, but I would speculate that people in RM have (on average) started working on less coding-intensive topics. Part of the problem is that second-order arithmetic cannot directly deal with functions from $\mathbb{R}$ to $\mathbb{R}$ that are discontinuous, which is discussed in detail in my "splittings and disjunctions" paper (see arXiv/NDJFL).
Secondly, regarding your question on "powerful" axioms: this ties in nicely with recent results Dag Normann and I have obtained. In a nutshell, the usual scale for measuring logical strength (based on comprehension and discontinuous functionals) is not satisfactory for e.g. Cousin's lemma, and we need a second scale, based on (classically valid) continuity axioms from Brouwer's intuitionistic mathematics. This new scale is a Platonist's dream: the canonical ECF embedding maps part of this new scale, together with its equivalences, onto the 'Big Five' and their equivalences. In other words, the Big Five are merely a reflection of a higher-order truth!
Let me first sketch the main results in our paper pertaining to Cousin's lemma.
We work in the language of higher-order arithmetic. This means that everything below should be interpreted in Kohlenbach's higher-order RM and Kleene's higher-order computability theory (S1-S9). Not much specific knowledge of these frameworks is needed, however.
Let HBU be Cousin's lemma for $[0,1]$, i.e. for any $\Psi:[0,1] \rightarrow \mathbb{R}^+$, there are $y_0, \dots, y_k \in [0,1]$ such that $\bigcup_{i\leq k} B(y_i, \Psi(y_i))$ covers $[0,1]$. In other words, the reals $y_0, \dots, y_k \in [0,1]$ provide a finite sub-covering of the uncountable covering $\bigcup_{x\in [0,1]}B(x,\Psi(x))$.
Let $Z_2$ be second-order arithmetic with language $L_2$. The systems $Z_2^\omega$ and $Z_2^\Omega$ are known to be conservative extensions of $Z_2$. The former cannot prove HBU, while the latter can; this is what we mean by "a proof of HBU requires full second-order arithmetic".
Clearly, HBU is a statement in the language of third-order arithmetic. The system $Z_2^\omega$ is also third-order in nature: it includes, for any $k \geq 1$, a third-order functional $S_k$ that decides $\Pi_k^1$-formulas from $L_2$ in Kleene normal form (see the work of Sieg & Feferman). The system $Z_2^\Omega$ is fourth-order, as it is based on Kleene's (comprehension) quantifier $\exists^3$ (see the work of Kleene on higher-order recursion theory). Note in particular that HBU is provable in ZF: countable choice is not needed.
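Roughly, in Kohlenbach's framework these functionals are given by the following axioms (modulo notational details and the coding of reals): Kleene's quantifiers $\exists^2$ and $\exists^3$ read
$$(\exists^2):\ (\exists E^2)(\forall f^1)\big[E(f)=0\leftrightarrow (\exists n^0)(f(n)=0)\big], \qquad (\exists^3):\ (\exists E^3)(\forall Y^2)\big[E(Y)=0\leftrightarrow (\exists f^1)(Y(f)=0)\big].$$
Intuitively, $\exists^2$ yields arithmetical comprehension, the $S_k$ decide $\Pi^1_k$-formulas, and $\exists^3$ quantifies over all of Baire space and hence yields full second-order comprehension; $Z_2^\omega$ is the base theory plus all the $S_k$, while $Z_2^\Omega$ is the base theory plus $(\exists^3)$.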
There are many statements that exhibit the same (or similar) behaviour as HBU. I refer to e.g. our paper on Pincherle's theorem (APAL) and open sets (JLC), where whole lists can be found, as well as the original paper. Convergence theorems for nets in $[0,1]$ indexed by Baire space also behave like HBU (see my 2019 CiE and WoLLIC papers).
Now that we have established the results, let me explain what they mean. There is an apparent contradiction here: on the one hand, HBU should intuitively be weak; on the other hand, we need absurdly strong comprehension axioms to prove it. I believe this is the feeling you express in your posting.
The fundamental problem is that we are comparing apples and oranges as follows:
The aforementioned comprehension functionals $\exists^3$ and $S_k$ are discontinuous (in the usual sense of mathematics). By contrast, HBU does not imply the existence of a discontinuous function (say on $\mathbb{R}$ or Baire space). Let us call a (third-order) theorem 'normal' if it implies the existence of a discontinuous function on $\mathbb{R}$, and 'non-normal' otherwise.
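To make the contrast concrete: if I recall Kohlenbach's result (obtained via Grilliot's trick) correctly, over the base theory $(\exists^2)$ is equivalent to the existence of a discontinuous function on $\mathbb{R}$, e.g.
$$F(x)=\begin{cases}1 & \text{if } x>_{\mathbb{R}}0,\\ 0 & \text{otherwise.}\end{cases}$$
The comprehension functionals thus live squarely on the 'normal' side of this divide, while HBU gives rise to no such $F$.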
It is an empirical observation that there are many non-normal theorems (like HBU) that cannot be proved in $Z_2^\omega$, but can be proved in $Z_2^\Omega$. In other words, the usual 'normal' scale based on comprehension functionals is not a good scale for analysing the strength of non-normal theorems.
In a nutshell: normal theorems = apples and non-normal theorems = oranges.
An obvious follow-up question is:
What is a good scale for analysing non-normal theorems?
As explored in the following paper (see Section 5), the neighbourhood function principle NFP provides the right scale.
https://arxiv.org/abs/1908.05676
NFP is a classically valid continuity axiom from Brouwer's intuitionistic mathematics.
Fragments of NFP are equivalent to e.g. HBU and other milestone non-normal theorems, like the monotone convergence theorem for nets in [0,1] (called $\textsf{MCT}_{\textsf{net}}$ in the above paper). Note that NFP was introduced under a different name by Troelstra-Kreisel, and is mentioned in Troelstra & van Dalen.
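For completeness, NFP reads roughly as follows (see Section 5 of the paper above for the precise formulation): for any formula $A$,
$$(\forall f \in \mathbb{N}^{\mathbb{N}})(\exists n \in \mathbb{N})\,A(\overline{f}n) \rightarrow (\exists \gamma \in K_0)(\forall f \in \mathbb{N}^{\mathbb{N}})\,A(\overline{f}\gamma(f)),$$
where $K_0$ is the class of neighbourhood functions (codes for continuous functionals on Baire space), $\overline{f}n$ is the initial segment of $f$ of length $n$, and $\gamma(f)$ is the value computed by $\gamma$ on $f$. Fragments of NFP are then obtained by restricting the formula class of $A$.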
Finally, the Kleene-Kreisel 'ECF' embedding is the canonical embedding of higher-order into second-order arithmetic. It maps third-order and higher objects to second-order associates/RM-codes, which reflects the 'coding practice' of RM.
What is more, the ECF embedding maps equivalences involving HBU to equivalences involving WKL, as follows:
HBU $\leftrightarrow$ Dini's theorem for nets (indexed by Baire space).
is mapped by ECF to
HBC $\leftrightarrow$ Dini's (usual) theorem (for sequences),
where HBC is the Heine-Borel theorem for countable coverings of intervals.
Another example is the following:
RANGE $\leftrightarrow$ Monotone convergence theorem for nets (indexed by Baire space)
is mapped by ECF to
range $\leftrightarrow$ Monotone convergence theorem (for sequences),
where RANGE states that the range of a third-order function exists, while range states that the range of a (second-order) function exists; it is well-known that range $\leftrightarrow \textsf{ACA}_0$.
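Spelled out (again modulo notational details): range states that for every $g:\mathbb{N}\rightarrow\mathbb{N}$ the set $\{n : (\exists m)(g(m)=n)\}$ exists, which is the textbook statement equivalent to $\textsf{ACA}_0$, while RANGE is the analogous claim for $Y:\mathbb{N}^{\mathbb{N}}\rightarrow\mathbb{N}$, namely that $\{n : (\exists f \in \mathbb{N}^{\mathbb{N}})(Y(f)=n)\}$ exists as a set.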
In general, the Big Five equivalences are a reflection, under ECF, of higher-order equivalences. Since ECF is a lossy translation, this situation resembles (in my not so humble opinion) Plato's allegory of the cave. This observation is inspired by Steve Simpson's writings on Aristotle that can be found in the (RM) literature.
I would like to finish this answer with a history lesson: I have heard wildly inaccurate claims from very smart (RM) people about the history of mathematics. These claims are often used to justify the coding practice of RM. So let us set the historical record straight, while it still matters.
1) Hilbert and Bernays did NOT introduce second-order arithmetic. In the "Grundlagen der Mathematik", they formalise a good deal of mathematics in a logical system $H$ (see esp. Supplement IV). This system involves third-order parameters in an essential way, as has been observed before by Sieg (see e.g. his book on Hilbert's program). Hilbert-Bernays vaguely sketch how one could perhaps do the same formalisation with less.
I was told that Kreisel then introduced second-order arithmetic based on the above.
2) Riemann's Habilitationsschrift established discontinuous functions as part of the mathematical mainstream around 1850. Thus, discontinuous functions definitely predate set theory.
3) The modern concept of function was already formulated by Dirichlet and Lobachevsky in the 1830s. (This view is not without its critics.)
4) The gauge integral is more general than the Lebesgue integral. In particular, the main theorems of the former (Hake's theorem and the FTC) apply to any (possibly non-measurable) function. In this way, the development of the gauge integral does not need measure theory, but can instead be done similarly to the Riemann integral. Studying the gauge integral restricted to measurable functions goes against its generalist/historical spirit, to say the least.
5) There are a number of formalisms for giving meaning to Feynman's path integral. The gauge integral is heralded as one of the few that can avoid 'imaginary time', a desirable feature from the point of view of physics. This is briefly discussed on page 20 here:
https://arxiv.org/abs/1711.08939
References are provided, of course.
Sam's answer is, of course, the definitive one. For curiosity's sake, I'll mention that Rod Downey, Noam Greenberg and I have recently been looking at Cousin's lemma for restricted (i.e. cardinality $\mathfrak{c}$) classes of functions.
We have a proof of Cousin's lemma in $\Pi^1_1$-$\mathsf{CA}_0$ that works for any function, to the extent that "any function" can be formalised in second-order arithmetic. This shows $\Pi^1_1$-$\mathsf{CA}_0$ is an upper bound for Cousin's lemma for any class of functions definable in second-order arithmetic. This doesn't contradict what Sam said above, since you need third-order arithmetic to talk about all functions, and then $\Pi^1_1$-$\mathsf{CA}_0$ is not enough.
We've specifically focused on continuous functions and Baire functions. So far, we've proved:
- Cousin's lemma for continuous functions is equivalent to $\mathsf{WKL}_0$;
- Cousin's lemma for Baire 1 functions is between $\mathsf{ACA}_0$ and $\Pi^1_1$-$\mathsf{CA}_0$;
- Cousin's lemma for Baire $n$ functions, $n \geq 2$, is between $\mathsf{ATR}_0$ and $\Pi^1_1$-$\mathsf{CA}_0$.
These results are in my thesis, and we're currently writing them up for publication.
To answer the OP's questions, Dag Normann and Sam Sanders' results do contradict (a). But to me, this is not such a surprise. As Sam notes, arbitrary discontinuous functions can't be formalised in second-order arithmetic. So, to the extent that classical analysis deals with discontinuous functions, we can't even formalise it in second-order arithmetic, let alone prove its theorems in, say, $\Pi^1_1$-$\mathsf{CA}_0$. This just shows that SOA is sufficient for most, but not all, of mathematics (the usual exceptions being parts of topology, set theory, etc.).
As for (b), it also contradicts this, if you assume that it is really necessary to consider Cousin's lemma for arbitrary gauges. As Russell A. Gordon shows in his book, "The Integrals of Lebesgue, Denjoy, Perron, and Henstock", it is enough to consider measurable gauges, and maybe this can be further restricted to Borel gauges (I don't know). In that case, maybe $\mathsf{ATR}_0$ or $\Pi^1_1$-$\mathsf{CA}_0$ is enough for the physicists.