As a former physics major, I did a lot of (seemingly sloppy) calculus using the notion of infinitesimals. Recently I heard that there is a branch of math called non-standard analysis that provides a formal foundation for this type of calculus.

So, do you guys think it is a subject worth learning? Is it a branch that is growing and becoming interconnected with other branches of math? Does it make calculus any easier?

**Answer**

I think this (interesting!) question is yet another instance of the occasional mismatch between science (and human perception) and formal mathematics. For example, the arguments used by the questioner, and common throughout science and engineering, were also those used by Euler and other mathematicians for 150 years prior to Abel, Cauchy, Weierstrass, and others’ “rigorization” of calculus. The point is that the extreme usefulness and power of calculus and differential equations was demonstrated *prior* to epsilon-delta proofs!

Similarly, c. 1900 Heaviside’s (and others’) use of derivatives of non-differentiable functions, of the “Dirac delta” function and its derivatives, and so on, brought considerable ridicule on him from the mathematics establishment, but his mathematics worked well enough to build the transatlantic telegraph cable. “Justification” was only provided by the work of Sobolev (1930s) and Schwartz (1940s).

And I think there are still severe problems with Feynman diagrams, even tho’ he and others could immediately use them to obtain correct answers to previously-thorny quantum computations.

One conclusion *I* have reached from such stories is that, if we have reasonable physical intuition, we have less obligation to fret than undergrad textbooks would have us believe.

But, back to the actual question: depending on one’s tastes, non-standard analysis can be pretty entertaining to study, especially if one does not worry about the “theoretical” underpinnings. However, to be “rigorous” in use of non-standard analysis requires considerable effort, perhaps more than that required by other approaches. For example, the requisite model theory itself, while quite interesting if one finds such things interesting, is non-trivial.
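(To give the flavor of it, here is the standard textbook illustration, not anything specific to the discussion above: in the non-standard setting a derivative is computed literally with a nonzero infinitesimal $\varepsilon$ and the standard-part map $\operatorname{st}(\cdot)$, e.g. for $f(x)=x^2$:)

```latex
f'(x) \;=\; \operatorname{st}\!\left(\frac{(x+\varepsilon)^2 - x^2}{\varepsilon}\right)
      \;=\; \operatorname{st}\!\left(2x + \varepsilon\right)
      \;=\; 2x
```

That is, the old physicists’ habit of “discarding the leftover infinitesimal” becomes the legitimate operation of taking the standard part — but making the hyperreals and $\operatorname{st}$ themselves rigorous is where the model-theoretic effort goes.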

In the early 1970s, some results in functional analysis were obtained first by non-standard analysis, raising the possibility that such methods would, indeed, provide means otherwise unavailable. However, people found “standard” proofs soon after, and nothing comparable seems to have happened since, with regard to non-standard analysis.

With regard to model theory itself, the recent high-profile proof of the “fundamental lemma” in Langlands’ program did make essential use of serious model theory… and there is no immediate prospect of replacing it. That’s a much more complicated situation, tho’.

With regard to “intuitive analysis”, my current impression is that learning an informal version of L. Schwartz’ theory of distributions is more helpful. True, there are still issues of underlying technicalities, but, for someone in physics or mechanics or PDE… those underlying technicalities *themselves* have considerable physical sense, as opposed to purely mathematical content.
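(As a textbook illustration of what is meant, not drawn from any particular source above: the Heaviside step function $H$, classically non-differentiable at $0$, acquires a perfectly legitimate derivative in distribution theory, defined by its action on a test function $\varphi$:)

```latex
\langle H', \varphi \rangle
  \;=\; -\,\langle H, \varphi' \rangle
  \;=\; -\int_0^\infty \varphi'(x)\,dx
  \;=\; \varphi(0)
  \;=\; \langle \delta, \varphi \rangle
```

So $H' = \delta$, exactly in the sense Heaviside used informally — and the “integration by parts against a test function” definition is the kind of underlying technicality that itself has physical sense.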

Strichartz’ nice little book “A guide to distribution theory and Fourier transforms” captures the positive aspect of this, to my taste, altho’ the Sobolev aspect is equally interesting. And, of course, beware the ponderous, lugubrious sources that’ll make you sorry you ever asked… 🙂 That is, *anything* can be turned into an ugly, technical mess in the hands of someone who craves it! 🙂

So, in summary, I think some parts of “modern analysis” (done lightly) more effectively fulfill one’s intuition about “infinitesimals” than does non-standard analysis.

**Attribution:** *Source: Link, Question Author: Community, Answer Author: 2 revs*