Is mathematical history written by the victors?

Certainly the victors write the history, generally. But when the victory is so complete that there is no further threat, the victors sometimes feel they can beneficently tolerate "docile" dissent. :)

Srsly, folks: having been on various sides of such questions, at least as an interested amateur, and having wanted new-and-wacky ideas to work, and having wanted a successful return to the intuition of some of Euler's arguments ... I'd have to say that at this moment the Schwartz-Grothendieck-Bochner-Sobolev-Hilbert-Schmidt-Beppo Levi (apologies to all those I left out...) enhancement of intuitive analysis is mostly far more cost-effective than various versions of "non-standard analysis".

In brief, the ultraproduct construction and "the rules", in A. Robinson's form, are a bit tricky, at least for people whose motivation is external (maybe lacking training in model theory or set theory or...). Fat books. Even the dubious "constructions of the reals" after Dedekind or Cauchy are less burdensome, Rube-Goldberg though they may seem.
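
For concreteness, here is the usual one-line sketch of the construction (one standard presentation; the bookkeeping around it is where the fat books come in): fix a nonprincipal ultrafilter $\mathcal{U}$ on $\mathbb{N}$ and take sequences of reals modulo agreement on a $\mathcal{U}$-large set of indices:
$$
{}^*\mathbb{R} \;=\; \mathbb{R}^{\mathbb{N}}/\mathcal{U},
\qquad
(a_n)\sim(b_n) \iff \{\,n : a_n=b_n\,\}\in\mathcal{U}.
$$
Already the existence of such a $\mathcal{U}$ requires a choice principle beyond anything invoked in day-to-day analysis, which is part of what makes the entry cost feel high.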

Nelson's "Internal Set Theory" version, as illustrated very compellingly by Alain Robert in a little book on it, as well, achieves a remarkable simplification and increased utility, in my opinion. By now, having spent some decades learning modern analysis, I do hopefully look for advantages in non-standard ideas that are not available even in the best "standard" analysis, but I cannot vouch for any ... yet.

Of course, presumably much of the "bias" is that relatively few people have been working on analysis from a non-standard viewpoint, while many, many have from a "standard" viewpoint, so the relative skewing of demonstrated advantage is not necessarily indicative...

There is a 1986 article by C. W. Henson and H. J. Keisler, "On the strength of nonstandard analysis" (J. Symbolic Logic, 1986), maybe cited by A. Robert?... which follows up on the idea that a well-packaged (as in Nelson) version of the set-theoretic subtlety of the existence of an ultraproduct is (maybe not so-) subtly stronger than the usual set-theoretic riffs we use in "doing analysis", even with AxCh as usually invoked ... which is mostly not very serious in any specific case. I have not personally investigated this situation... but...

Again, "winning" is certainly not a reliable sign of absolute virtue. Could be a PR triumph, luck, etc. In certain arenas "winning" would be a stigma...

And certainly the excesses of the "analysis is measure theory" juggernaut are unfortunate... For that matter, a more radical opinion would be that Cantor would have found no need to invent set theory and discover problems if he'd not had a "construction of the reals".

Bottom line for me, just as one vote, one anecdotal data point: I am entirely open to non-standard methods, if they can prove themselves more effective than "standard" ones. Yes, I've invested considerable effort to learn "standard" methods, which, indeed, are very often badly represented in the literature, as monuments-in-the-desert to long-dead kings rather than as useful viewpoints, but which nevertheless afford some reincarnation of Euler's ideas ... albeit in different language.

That is, as a willing-to-be-an-iconoclast student of many threads, I think that (noting the bias of the number of people working to promote and prove the utility of various viewpoints!!!) a suitably modernized (= Beppo Levi, Sobolev, Friedrichs, Schwartz, Grothendieck, et al.) epsilon-delta (= classical) viewpoint can accommodate Euler's intuition adequately. So far, although Nelson's IST is much better than the alternatives, I've not (yet?) seen that viewpoint produce something that was not comparably visible from the "standard" "modern" viewpoint.


To give an example of the kind of answer requested here, note that one of the first examples in the NAMS text is from David Mumford, who wrote about overcoming his own prejudice (stemming from what he was taught concerning infinitesimals) in the following terms: "In my own education, I had assumed that Enriques [and the Italians] were irrevocably stuck.… As I see it now, Enriques must be credited with a nearly complete geometric proof using, as did Grothendieck, higher order infinitesimal deformations.… Let’s be careful: he certainly had the correct ideas about infinitesimal geometry, though he had no idea at all how to make precise definitions."

I enjoyed paul garrett's answer, though it is steered in a slightly different direction, namely the effectiveness of NSA in cutting-edge research, whereas my question is mostly concerned with historical interpretation and getting an accurate picture of the mathematical past.

To give another example, Fermat's procedure of adequality involves a step where Fermat drops the remaining "E" terms; he carefully chooses his terminology and does not set them equal to zero. Similar remarks apply to Leibniz. Yet historians often assume that there is a logical contradiction involved at the basis of their methods, which can be summarized in the notation of modern logic as $(dx\not=0)\wedge(dx=0)$. Such remarks often go hand-in-hand with claims that the alleged logical contradiction was finally resolved around 1870. Without detracting from the greatness of the accomplishment around 1870, such criticism of the early pioneers of the calculus may not be on target.
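
To spell out the procedure in the standard textbook reconstruction (the notation, apart from Fermat's "E", is of course modern): to find the maximum of $f(A)=A(B-A)$, Fermat adequates $f(A)$ with $f(A+E)$:
$$
A(B-A) \sim (A+E)(B-A-E) = A(B-A) + E(B-2A) - E^2 .
$$
Cancelling $A(B-A)$ and dividing by $E$ (legitimate only if $E\not=0$) leaves $B-2A-E\sim 0$; dropping the remaining $E$ term then gives $A=B/2$. The point is that the last step discards $E$ rather than asserting $E=0$, which is precisely where the two readings diverge.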


(This is meant as a response to a comment by Pete L. Clark on whether the history of analysis was a "linear progression". Due to its length I decided to post it as a separate answer.) I agree that focusing on the term "linear" is not the issue. What does seem to be a meaningful issue is the following closely related question.

Is it accurate to view the formalisation of analysis around 1870, an extremely important development by all accounts, as having established the "true" foundation of analysis in the context of the Archimedean continuum, by eliminating infinitesimals?

An alternative view is that the success of the Archimedean formalisation in fact incorporated an aspect of failure, as well, namely a failure to formalize a ubiquitous aspect of analysis as it had been practiced since 1670: the infinitesimal.

According to the alternative view, there is not one strand but two parallel strands in the development of analysis: one in the context of an Archimedean continuum, as formalized around 1870, and one in the context of what could be called a Bernoullian continuum (Johann Bernoulli having been the first to base analysis systematically and exclusively on a system incorporating infinitesimals). The latter strand was not formalized until the work of Edwin Hewitt in the 1940s, Jerzy Łoś in the 1950s, and especially Abraham Robinson in the 1960s, but its sources are already in the work of the great pioneers of the 17th century.
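
What "formalized" amounts to here can be stated compactly. In the ultrapower model, with $\mathcal{U}$ a nonprincipal ultrafilter on $\mathbb{N}$, Łoś's theorem yields the transfer principle: a first-order statement holds of hyperreals exactly when it holds coordinatewise on a $\mathcal{U}$-large set of indices,
$$
{}^*\mathbb{R}\models\varphi\big([a^1_n],\dots,[a^k_n]\big)
\iff
\{\,n : \mathbb{R}\models\varphi(a^1_n,\dots,a^k_n)\,\}\in\mathcal{U},
$$
so the infinitesimals obey the same first-order laws as the ordinary reals. (This is the standard formulation, with details suppressed.)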

To give an example: in his recent article (Gray, J.: A short life of Euler. BSHM Bulletin: Journal of the British Society for the History of Mathematics 23 (2008), no. 1, 1--12), Gray makes the following comment:

"At some point it should be admitted that Euler's attempts at explaining the foundations of calculus in terms of differentials, which are and are not zero, are dreadfully weak" (p. 6). He provides no evidence for this claim.

It seems to me that Gray's sweeping claim comes from a "linear progression" school of thinking, where Weierstrass is credited with eliminating logically faulty infinitesimals, so that of course Euler, who used infinitesimals galore, would necessarily be "dreadfully weak", with no further explanation needed.
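
To make the point concrete (a routine modern reconstruction, not a claim about Euler's own foundations): Euler computes the differential ratio of $x^2$ as
$$
\frac{(x+dx)^2-x^2}{dx} = 2x+dx,
$$
and then discards the $dx$ on the right. Read against a Bernoullian continuum, nothing "dreadfully weak" need be happening: $dx$ is a nonzero infinitesimal throughout, and the final step is the standard part, $\mathrm{st}(2x+dx)=2x$, rather than the contradictory pair $(dx\not=0)\wedge(dx=0)$. Whether this is a fair reading of Euler is, of course, precisely the historical question at issue.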