# Different sizes of infinity

Correct me if I’m wrong, but this is what they taught us in precalc:
$$\lim_{x\rightarrow\infty}x=\infty$$
$$\lim_{x\rightarrow\infty}x^{2}=\infty$$
But we also know that $n^{2}>n$ if $n\notin [0,1]$.

Does that mean that some infinities are greater than others? Why don’t we explicitly define infinity so that we can show differences in sizes?

The notation $\displaystyle \lim_{x\rightarrow\infty}f(x)=\infty$ where $f$ is any real-valued function of a real variable means exactly that as $x$ gets arbitrarily large, so does $f(x)$. That’s all it means. This usage has no relation to any metaphysical ideas about infinity.
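To make that precise, the standard definition can be stated as:

$$\lim_{x\rightarrow\infty}f(x)=\infty \quad\text{means}\quad \forall M>0,\ \exists N>0 \text{ such that } x>N \implies f(x)>M.$$

In words: no matter how large a bound $M$ you pick, $f(x)$ eventually exceeds it and stays above it. The symbol $\infty$ here is shorthand for this quantified statement; it is not a number that $f$ ever reaches.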

It’s also the case that in the mathematical field of set theory, there is an entire elaborate theory of transfinite numbers. This is a very interesting area of math and the basics are accessible at an elementary level so do take a look at this if you’re interested. This usage of infinity is not related to the infinity of the first paragraph.
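For a taste of the set-theoretic meaning: two sets have the same cardinality if there is a bijection between them. For example, the map

$$f:\mathbb{N}\to 2\mathbb{N},\qquad f(n)=2n$$

pairs the natural numbers with the even natural numbers, so an infinite set can have the same size as a proper subset of itself. On the other hand, Cantor's diagonal argument shows there is no bijection from $\mathbb{N}$ to $\mathbb{R}$, so the reals form a strictly larger infinity. This is the precise sense in which "some infinities are greater than others" in set theory.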

The observation you made about $x$ and $x^2$ happens to lead to yet another interesting area of math: namely the study of the rate at which functions grow. For example as you noted, the functions $x$ and $x^2$ each go to infinity (meaning they grow without bound) as $x$ gets large. And yet, $x^2$ grows “faster” than $x$ in some way.
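One standard way to make "faster" precise is to look at the ratio of the two functions:

$$\lim_{x\rightarrow\infty}\frac{x^{2}}{x}=\lim_{x\rightarrow\infty}x=\infty,$$

so $x^2$ eventually dwarfs $x$ by any factor you like, even though both tend to infinity. In asymptotic notation this is written $x = o(x^{2})$.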

Growth rates of functions have been studied since the late nineteenth century. They have always been important in some areas of pure math, and today they are of central interest in computer science.

Suppose we have two algorithms whose running time increases as a function of the length of their input (sorting a list, say). Then as the input gets larger, the algorithm that grows faster starts to take an impractical amount of time. Computer scientists are always interested in finding algorithms that grow slowly.
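To see why, here is a minimal sketch (the function names and the specific algorithms are just illustrative assumptions) comparing the operation counts of a linear-time algorithm, such as a single scan for the maximum, against a quadratic-time one, such as a naive all-pairs duplicate check:

```python
def linear_ops(n):
    """Basic operations for a single pass over n items (e.g. finding a maximum)."""
    return n

def quadratic_ops(n):
    """Basic operations for comparing every pair of n items (e.g. a naive duplicate check)."""
    return n * (n - 1) // 2

# At small sizes the difference is negligible; at large sizes it is decisive.
for n in (10, 1_000, 100_000):
    print(f"n={n}: linear={linear_ops(n)}, quadratic={quadratic_ops(n)}")
```

At $n = 100{,}000$ the linear algorithm does $100{,}000$ operations while the quadratic one does roughly five billion, which is exactly the "impractical amount of time" problem: the gap keeps widening as the input grows.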

The growth rate of a given function is typically described using Big-O notation, which groups together functions with roughly the same growth rate. The study of how much time and memory algorithms require as their inputs grow is the field of computational complexity theory.
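For reference, the standard definition is:

$$f(x)=O(g(x)) \iff \exists\, C>0,\ x_{0} \text{ such that } |f(x)|\le C\,|g(x)| \text{ for all } x>x_{0}.$$

So $x = O(x^{2})$ but $x^{2} \ne O(x)$: no constant multiple of $x$ can bound $x^{2}$ for all large $x$, which again captures the idea that $x^2$ outgrows $x$.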