How helpful is non-standard analysis?

The other answers are excellent, but let me add a few points.

First, with a historical perspective, all the early fundamental theorems of calculus were first proved via methods using infinitesimals, rather than by methods using epsilon-delta arguments, since those methods did not appear until the nineteenth century. Calculus proceeded for centuries on the infinitesimal foundation, and the early arguments---whatever their level of rigor---are closer to their modern analogues in nonstandard analysis than to their modern analogues in epsilon-delta methods. In this sense, one could reasonably answer your question by pointing to any of these early fundamental theorems.

To be sure, the epsilon-delta methods arose in part because mathematicians became unsure of the foundational validity of infinitesimals. But since nonstandard analysis exactly provides the missing legitimacy, the original motivation for adopting epsilon-delta arguments appears to fall away.

Second, while it is true that almost any application of nonstandard analysis in analysis can be carried out using standard methods, the converse is also true. That is, epsilon-delta arguments can often also be translated into nonstandard analysis. Furthermore, someone raised with nonstandard analysis in their mathematical childhood would likely prefer things this way. In this sense, the preference between the two methods may be a cultural matter of upbringing.

For example, H. Jerome Keisler wrote an introductory calculus textbook called Elementary Calculus: An Infinitesimal Approach, and this text was used for many years as the main calculus textbook at the University of Wisconsin, Madison. I encourage you to take a look at this interesting text, which looks at first like an ordinary calculus textbook, except that inside the front cover, next to the various formulas for derivatives and integrals, are also listed the various rules for manipulating infinitesimals, which pervade the text. Keisler writes:

This is a calculus textbook at the college Freshman level based on Abraham Robinson's infinitesimals, which date from 1960. Robinson's modern infinitesimal approach puts the intuitive ideas of the founders of the calculus on a mathematically sound footing, and is easier for beginners to understand than the more common approach via limits.
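To give a flavor of "calculating with infinitesimals" algebraically, here is a toy sketch using dual numbers, where a nilpotent eps satisfies eps² = 0. This is emphatically not Robinson's construction (the hyperreals require an ultrapower and contain invertible infinitesimals); it only illustrates how evaluating f(x + eps) reveals f'(x) by purely algebraic rules, in the spirit of the manipulations Keisler's text teaches:

```python
# Toy "infinitesimal" arithmetic via dual numbers: eps**2 = 0.
# NOT Robinson's hyperreals (those need an ultrapower construction);
# this only mimics the algebraic rule f(x + eps) = f(x) + f'(x)*eps.

class Dual:
    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 = 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

    __rmul__ = __mul__

def derivative_at(f, x):
    """Evaluate f at x + eps and read off the coefficient of eps."""
    return f(Dual(x, 1.0)).eps

# d/dx of x**3 at x = 2 is 3*x**2 = 12
print(derivative_at(lambda x: x * x * x, 2.0))  # -> 12.0
```

The derivative falls out of pure algebra, with no limit taken anywhere, which is the pedagogical point of the infinitesimal approach.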

Finally, third, some may take your question to presume that a central purpose of nonstandard analysis is to provide applications in analysis. But this is not correct. The concept of nonstandard models of arithmetic, of analysis and of set theory arose in mathematical logic and has grown into an entire field, with hundreds of articles and many books, with its own problems and questions and methods, quite divorced from any application of the methods in other parts of mathematics. For example, the subject of Models of Arithmetic is focused on understanding the nonstandard models of the first-order Peano axioms, and it makes little sense to analyze these models using only standard methods.

To mention just a few fascinating classical theorems: every countable nonstandard model of arithmetic is isomorphic to a proper initial segment of itself (H. Friedman). Under the Continuum Hypothesis, every Scott set (a family of sets of natural numbers closed under Boolean operations and Turing reducibility, and satisfying König's lemma) is the collection of definable sets of natural numbers of some nonstandard model of arithmetic (D. Scott and others). There is no nonstandard model of arithmetic for which either addition or multiplication is computable (S. Tennenbaum). Nonstandard models of arithmetic were also used to prove several fascinating independence results over PA, such as the results on Goodstein sequences, as well as the Paris-Harrington theorem on the independence over PA of a strong Ramsey theorem. Another interesting result shows that various forms of the pigeonhole principle are not equivalent over weak base theories; for example, the weak pigeonhole principle that there is no bijection of n with 2n is not provable over the base theory from the weaker principle that there is no bijection of n with n^2. These proofs all make fundamental use of nonstandard methods, which it would seem difficult or impossible to omit or to translate into standard methods.
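For concreteness, the Goodstein sequences mentioned above are simple to compute: write n in hereditary base-b notation, replace every b by b+1, subtract 1, and repeat with the next base. Goodstein's theorem says every such sequence eventually reaches 0, and this is the statement that is unprovable in PA. A minimal sketch (function names are my own):

```python
def bump_base(n, b):
    """Write n in hereditary base-b notation, then replace every b by b+1."""
    if n == 0:
        return 0
    result, power = 0, 0
    while n > 0:
        digit = n % b
        # the exponents themselves are rewritten hereditarily
        result += digit * (b + 1) ** bump_base(power, b)
        n //= b
        power += 1
    return result

def goodstein(n, steps):
    """The first `steps` terms of the Goodstein sequence starting at n."""
    seq, base = [n], 2
    for _ in range(steps - 1):
        if n == 0:
            break
        n = bump_base(n, base) - 1
        base += 1
        seq.append(n)
    return seq

print(goodstein(3, 8))  # -> [3, 3, 3, 2, 1, 0]
print(goodstein(4, 2))  # -> [4, 26]  (this sequence runs for ~10**121210694 steps)
```

Already for n = 4 the sequence grows to astronomical length before collapsing to 0, which hints at why the theorem outruns the proof-theoretic strength of PA.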


From the Wikipedia article:

the list of new applications in mathematics is still very small. One of these results is the theorem proven by Abraham Robinson and Allen Bernstein that every polynomially compact linear operator on a Hilbert space has an invariant subspace. Upon reading a preprint of the Bernstein-Robinson paper, Paul Halmos reinterpreted their proof using standard techniques. Both papers appeared back-to-back in the same issue of the Pacific Journal of Mathematics. Some of the ideas used in Halmos' proof reappeared many years later in Halmos' own work on quasi-triangular operators.


Nonstandard hulls of spaces are used all the time in Banach space theory, so much so that books devote sections to the construction of ultraproducts of Banach spaces (e.g., Absolutely Summing Operators by Diestel, Jarchow, and Tonge). There are cases where NSA is used to prove the existence of an estimate, yet no one knows how to compute the estimate directly. For example, the unconditional constant of any basis for the span of the first n unit basis vectors in the James space of sequences of bounded quadratic variation must go to infinity, but the only known proof uses NSA.