What do π and e stand for in the normal distribution formula?

I’m a beginner in mathematics and there is one thing that I’ve been wondering about recently. The formula for the normal distribution is:

f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}.

However, what are e and π doing there? π is about circles, the ratio of a circle’s circumference to its diameter, for example. e is mostly about exponential functions, specifically about the fact that \frac{d}{dx} e^{x} = e^{x}.

I’m sure there are proofs and articles available, but could someone shed some light on this and explain in more ‘informal’ language what they stand for here?

I’m very curious to know, since as far as I’m concerned those two numbers have very different meanings.

Answer

So I think you want to know “why” π and e appear here based on an explanation that goes back to circles and natural logarithms, which are the usual contexts in which one first sees these.

If you see π, you think there’s a circle hidden somewhere. And in fact there is. As has been pointed out, in order for this expression to give a probability density you need \int_{-\infty}^{\infty} f(x)\, dx = 1. (I’m not sure how much you know about integrals; this just means that the area between the graph of f(x) and the x-axis is 1.) But it turns out that this can be derived from the fact that \int_{-\infty}^{\infty} e^{-x^2}\, dx = \sqrt{\pi}.
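To spell out the step being used here (my own sketch; the substitution is not shown in the original answer): setting u = (x-\mu)/(\sqrt{2}\,\sigma), so that dx = \sqrt{2}\,\sigma\, du, turns the normalization condition into exactly that integral:
\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, dx = \frac{1}{\sqrt{\pi}} \int_{-\infty}^{\infty} e^{-u^2}\, du = \frac{\sqrt{\pi}}{\sqrt{\pi}} = 1.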

And it turns out that this is true because the square of this integral is π. Now, why should the square of this integral have anything to do with circles? Because it’s the total volume between the graph of e^{-(x^2+y^2)} (as a function g(x, y) of two variables) and the xy-plane. And of course x^2 + y^2 is just the square of the distance of (x, y) from the origin, so the volume I just mentioned is rotationally symmetric. (If you know about multiple integration, see the Wikipedia article “Gaussian integral”, under the heading “brief proof”, to see this volume worked out.)
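For reference, here is a compressed version of that computation (my own sketch of the standard polar-coordinates argument, not a quote from the Wikipedia article): since dx\, dy = r\, dr\, d\theta in polar coordinates,
\left( \int_{-\infty}^{\infty} e^{-x^2}\, dx \right)^{2} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-(x^2+y^2)}\, dx\, dy = \int_{0}^{2\pi}\int_{0}^{\infty} e^{-r^2}\, r\, dr\, d\theta = 2\pi \cdot \tfrac{1}{2} = \pi.
The extra factor of r is what makes the inner integral elementary: \int_{0}^{\infty} e^{-r^2}\, r\, dr = \tfrac{1}{2}, and the angular integral over the full circle contributes the 2\pi.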

As for where e comes from — perhaps you’ve seen that the normal probability density can be used to approximate the binomial distribution. In particular, the probability that if we flip n independent coins, each of which has probability p of coming up heads, we’ll get k heads is
{n \choose k} p^{k} (1-p)^{n-k}
where {n \choose k} = n!/(k! (n-k)!). And then there’s Stirling’s approximation,
n! \approx \sqrt{2\pi n} (n/e)^{n}.
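To make the link to the normal density concrete, here is a step I’m adding for the special case of a fair coin, p = 1/2 (it is not part of the original answer): plugging Stirling’s formula into the binomial probability for k near the mean, say k = n/2 + t, gives the de Moivre–Laplace approximation
{n \choose k}\, 2^{-n} \approx \sqrt{\frac{2}{\pi n}}\; e^{-2t^{2}/n},
which is exactly the normal density with \mu = n/2 and \sigma^{2} = n/4; both the e and the \pi come out of Stirling’s formula.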
So if you can see why e appears in Stirling’s formula, you can see why it appears in the normal distribution. Now, we can take logs of both sides of n! = 1 \cdot 2 \cdot \ldots \cdot n to get
\log (n!) = \log 1 + \log 2 + \cdots + \log n
and we can approximate the sum by an integral,
\log (n!) \approx \int_{1}^{n} \log t \: dt.
But the indefinite integral here is t \log t - t, and so we get the definite integral
\log (n!) \approx n \log n - n.
Exponentiating both sides gives n! \approx (n/e)^n. This is off by a factor of \sqrt{2\pi n}, but it at least explains the appearance of e: it comes from the logarithms in the derivation. This often happens when we deal with probabilities involving lots of events, because we have to find products of many terms; we have a well-developed theory for sums of very large numbers of terms (basically, integration), which we can plug into by taking logs.
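If you want to see how good these approximations are, here is a small numerical check (my own snippet, not part of the original answer); it compares everything on the log scale to avoid overflow:

```python
import math

# Compare the exact log(n!) with the crude approximation n*log(n) - n
# and with the full Stirling formula sqrt(2*pi*n) * (n/e)**n, all in logs.
for n in (10, 50, 100):
    log_factorial = math.lgamma(n + 1)                  # exact log(n!)
    crude = n * math.log(n) - n                         # from the integral of log t
    stirling = 0.5 * math.log(2 * math.pi * n) + crude  # full Stirling formula, in logs
    print(n, round(log_factorial, 3), round(crude, 3), round(stirling, 3))
```

Already at n = 50 the full formula agrees with \log(n!) to within about 0.002, while the crude version is off by roughly 2.9, which is just the log of the missing \sqrt{2\pi n} factor.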

Attribution
Source : Link , Question Author : pimvdb , Answer Author : hlapointe
