# Proofs of AM-GM inequality

The arithmetic–geometric mean inequality states that for positive real numbers $x_1, \dots, x_n$,

$$\frac{x_1 + \dots + x_n}{n} \ge \sqrt[n]{x_1 \cdots x_n},$$

with equality if and only if all the $x_i$ are equal.

I’m looking for some original proofs of this inequality. I can find the usual proofs on the internet, but I was wondering if someone knew a proof that is unexpected in some way: for example, a proof that links it to some famous theorem, a non-trivial geometric proof (I can find some of those), or a proof using theory that doesn’t seem connected to this inequality at first sight (e.g. differential equations …)?

Induction, backward induction, Jensen’s inequality, swapping terms, Lagrange multipliers, a proof using thermodynamics (yeah, I know, it’s rather a physical argument that this theorem might be true, not really a proof), convexity, … are some of the proofs I know.

This is a fairly old answer of mine with a proof that was not very motivated, so I thought I’d update it with some geometric intuition about convexity.

Consider for simplicity the two-variable case $$(a+b)/2 \ge \sqrt{ab}$$ and fix, say, $a = 1$. The plots of $(1+b)/2$ and $\sqrt{b}$ show intuitively how the concave nature of the geometric mean implies it will always lie below the arithmetic mean. Also observe the equality at one point. In fact, this concavity extends to any number of variables, but obviously a plot is not a proof.
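As a quick numerical illustration (a sanity check, not part of any proof), we can sample the gap $(1+b)/2 - \sqrt{b}$ for the case $a = 1$ and see that it is nonnegative and vanishes only at $b = 1$:

```python
import math

# Gap between the arithmetic and geometric means for a = 1:
# (1 + b)/2 - sqrt(b), which should be >= 0, with equality at b = 1.
def gap(b):
    return (1 + b) / 2 - math.sqrt(b)

samples = [0.1, 0.5, 1.0, 2.0, 10.0]
gaps = [gap(b) for b in samples]
print(gaps)  # every entry is >= 0; the entry for b = 1.0 is exactly 0.0
```

The tangency at $b = 1$ is exactly the equality case of the inequality.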

The proof presented here for more than two variables requires only elementary properties of logarithms, which turn multiplication into addition. The inequality to prove is

$$\frac{x_1 + \dots + x_n}{n}\ge (x_1 \dots x_n)^{1/n}$$

Taking logs preserves the inequality since $\log$ is an increasing function:

$$\iff \log \left(\frac{x_1 + \dots + x_n}{n}\right) \ge \frac 1 n \log (x_1 \dots x_n) = \frac{\log x_1 + \dots + \log x_n}{n}$$

$$\DeclareMathOperator{\E}{E}$$
If we write $\E[X]$ for the mean of the $x_i$'s and $\E[\log(X)]$ for the mean of the $\log x_i$'s, we can also understand this in the language of expectation:

$$\log(\E[X]) \ge \E[\log (X)]$$

By the concavity of $\log$ and Jensen's inequality (which can be proved inductively starting from the definition of convexity), the inequality holds.
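This expectation form is easy to check numerically (an illustration, not a proof): for any positive sample, the log of the mean dominates the mean of the logs.

```python
import math
import random

random.seed(0)
xs = [random.uniform(0.1, 10.0) for _ in range(1000)]

log_of_mean = math.log(sum(xs) / len(xs))               # log(E[X])
mean_of_logs = sum(math.log(x) for x in xs) / len(xs)   # E[log X]

print(log_of_mean >= mean_of_logs)  # True, as Jensen's inequality predicts
```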

Original post of Pólya's proof, using similar ideas via the convexity of $e^x$:

Let $f(x) = e^{x-1}-x$. The first derivative is $f'(x)=e^{x-1}-1$ and the second derivative is $f''(x) = e^{x-1}$.

$f$ is convex everywhere because $f''(x) > 0$, and it attains its minimum at $x=1$, where $f(1) = 0$. Therefore $x \le e^{x-1}$ for all $x$, with equality only when $x=1$.
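The pointwise bound can be spot-checked numerically (again just an illustration): the slack $e^{x-1} - x$ is nonnegative everywhere and zero exactly at $x = 1$.

```python
import math

# Polya's pointwise bound: x <= e^{x-1} for all real x,
# with equality only at x = 1.
def slack(x):
    return math.exp(x - 1) - x

xs = [-2.0, 0.0, 0.5, 1.0, 2.0, 5.0]
slacks = [slack(x) for x in xs]
print(slacks)  # every entry is >= 0; slack(1.0) is exactly 0.0
```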

Using this inequality we get

$$\frac{x_1}{a} \frac{x_2}{a} \cdots \frac{x_n}{a} \le e^{\frac{x_1}{a}-1} e^{\frac{x_2}{a}-1} \cdots e^{\frac{x_n}{a}-1}$$

with $a$ being the arithmetic mean. The right side simplifies to

$$\exp \left(\frac{x_1}{a} - 1 + \frac{x_2}{a} - 1 + \cdots + \frac{x_n}{a} - 1 \right)$$

$$=\exp \left(\frac{x_1 + x_2 + \cdots + x_n}{a} - n \right) = \exp(n - n) = e^0 = 1$$

Going back to the first inequality, this yields

$$\frac{x_1x_2\cdots x_n}{a^n} \le 1$$

So we end with

$$\sqrt[n]{x_1x_2\cdots x_n} \le a$$
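The whole chain of Pólya's argument can be traced numerically for a concrete sample (a sketch for illustration; the values of `xs` are arbitrary): the product of the $x_i/a$ is bounded by the product of the $e^{x_i/a - 1}$, whose exponents sum to $n - n = 0$, so the bound is exactly $1$, and hence the geometric mean is at most the arithmetic mean.

```python
import math

def am(xs):
    """Arithmetic mean."""
    return sum(xs) / len(xs)

def gm(xs):
    """Geometric mean."""
    return math.prod(xs) ** (1 / len(xs))

xs = [1.0, 4.0, 2.5, 9.0]  # arbitrary positive sample
a = am(xs)

# Left side: product of x_i / a.  Right side: exp of the summed exponents,
# which telescopes to exp(n - n) = 1.
lhs = math.prod(x / a for x in xs)
rhs = math.exp(sum(x / a - 1 for x in xs))

print(lhs, rhs, gm(xs), a)  # lhs <= rhs, rhs is 1 (up to rounding), gm <= am
```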