We’ve just been learning about complex numbers in class, and I don’t really see why they’re called numbers.

Originally, a number was simply a means of counting (natural numbers).

Then we extend these numbers to cover situations like owing other people money (integers).

After that, we consider fractions, for when we need to split things like 2 pizzas among three people.

Then (skipping the algebraic numbers for practicality’s sake) we use the real numbers to describe any length, e.g. the length of a diagonal of a unit square.

But when we reach the complex numbers, our original definition of a number no longer makes sense, which brings me to my main question: what is a rigorous definition of ‘number’?

Wikipedia claims that “A number is a mathematical object used to count, label, and measure”, but this definition stops making sense once we extend beyond R.

Does anyone know of any definition of number that can be generalised to complex numbers as well (and even higher order number systems like the quaternions)?

**Answer**

A basic question is, what would be the purpose of such a definition? Would it clarify anything if we came up with a definition that, say, included quaternions and not matrices or analytic functions?

Most of the usages of the term “number” are due to historical choices that have lived on. I’d be interested in seeing things that were initially called “numbers” but are no longer called that; still, I think any definition that fits current usage is just a hack to justify choices at the boundaries.

As mentioned, I haven’t seen quaternions called “numbers.” We say 1+i is a “complex number,” but we just say 1+i+j+k is a “quaternion.” At least in my experience.

**Algebraic numbers**

In “number theory,” we often deal with “algebraic extensions” of the rational numbers. For example, Q(√2) is the set of numbers of the form a + b√2 with a, b ∈ Q. These can be seen as a subset of R, but they actually exist more abstractly than that: algebraically, we don’t know whether √2 < 0 or √2 > 0; the number exists purely as an algebraic object, an object which, when squared, equals 2.
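The point that √2 can be handled as a purely algebraic object is easy to make concrete. Below is a minimal Python sketch (the class name `QSqrt2` is my own, not standard) that does exact arithmetic in Q(√2) using nothing but the rule s² = 2, with no real-number approximation anywhere:

```python
from fractions import Fraction

class QSqrt2:
    """Element a + b*s of Q(sqrt(2)), where s is a pure symbol
    satisfying s*s = 2 -- no decimal approximation of sqrt(2) is used."""
    def __init__(self, a, b):
        self.a, self.b = Fraction(a), Fraction(b)

    def __add__(self, other):
        return QSqrt2(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a + b*s)(c + d*s) = (ac + 2bd) + (ad + bc)*s, using s*s = 2
        return QSqrt2(self.a * other.a + 2 * self.b * other.b,
                      self.a * other.b + self.b * other.a)

    def __eq__(self, other):
        return (self.a, self.b) == (other.a, other.b)

    def __repr__(self):
        return f"{self.a} + {self.b}*sqrt(2)"

s = QSqrt2(0, 1)               # the abstract square root of 2
assert s * s == QSqrt2(2, 0)   # s^2 = 2 by definition, not by approximation
```

Replacing the rule s² = 2 with s² = −1 gives exactly the same kind of arithmetic for Q(√−1), which is the sense in which the two fields are on equal footing.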

The same thing happens with Q(√−1). It would be strange to call √2 a "number" and not call √−1 a "number" in this context. Mathematicians call the two fields "algebraic number fields."

For example, Q(∛2) can be seen as isomorphic to a subset of R, but it is also isomorphic to a (different) subset of C.

**Complex Numbers**

There are also ways to see, inside the 'real numbers,' that the complex numbers sort of have to exist. My favorite way: if you look at the Taylor series of f(x) = 1/(x² − x) at x = a, you find that the radius of convergence is min(|a|, |a − 1|). That is, the zeros of the denominator "block" the Taylor series. If you look at the Taylor series of g(x) = 1/(1 + x²) at a real number x = a, you find that the radius of convergence is √(1 + a²). Something (a zero of 1 + x²?) is "blocking" the Taylor series of g(x), and it looks like it sits at a distance of exactly 1 from 0, in a direction perpendicular to the real line.
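This can be checked numerically. The sketch below (the function name is mine) generates the Taylor coefficients of g(x) = 1/(1 + x²) about x = a from the recurrence obtained by expanding (1 + (a + h)²)·g(a + h) = 1 in powers of h, then estimates the radius of convergence with the root test:

```python
import math

def taylor_radius(a, n_terms=400):
    """Estimate the radius of convergence of the Taylor series of
    g(x) = 1/(1 + x^2) about x = a via the root test on its coefficients.

    Expanding (1 + (a+h)^2) * g(a+h) = 1 in h gives, for n >= 2,
    (1 + a^2) c_n + 2a c_{n-1} + c_{n-2} = 0.
    """
    d = 1.0 + a * a
    c = [1.0 / d, -2.0 * a / (d * d)]      # c_0 = g(a), c_1 = g'(a)
    for _ in range(2, n_terms):
        c.append(-(2.0 * a * c[-1] + c[-2]) / d)
    # limsup |c_n|^(1/n): take the max over a tail window, since
    # individual coefficients can be (near) zero.
    tail = range(n_terms - 20, n_terms)
    return 1.0 / max(abs(c[n]) ** (1.0 / n) for n in tail)

print(taylor_radius(1.0), math.sqrt(2.0))  # both close to 1.414...
```

The estimate hugs √(1 + a²) for any real a you try, even though the computation never leaves the real numbers: the invisible singularity at x = i is doing the blocking.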

Complex numbers are also necessary for breaking real matrices down into component parts. Well, not "necessary," but the representation of matrices in, say, Jordan canonical form becomes quite a bit more complicated without complex numbers. So complex numbers, oddly, make matrices seem more regular (or, if you prefer, hide the complexity).
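A tiny illustration of real matrices forcing complex numbers on us: a rotation matrix has purely real entries but no real eigenvalues. The sketch below (helper name is mine) computes them directly from the characteristic polynomial:

```python
import cmath
import math

def eigenvalues_2x2(m):
    """Eigenvalues of a 2x2 matrix [[p, q], [r, s]] via the quadratic
    formula applied to t^2 - (p + s) t + (p s - q r)."""
    (p, q), (r, s) = m
    trace, det = p + s, p * s - q * r
    disc = cmath.sqrt(trace * trace - 4 * det)   # complex if trace^2 < 4 det
    return (trace + disc) / 2, (trace - disc) / 2

theta = math.pi / 3
rot = [[math.cos(theta), -math.sin(theta)],
       [math.sin(theta),  math.cos(theta)]]     # rotation by 60 degrees

lam1, lam2 = eigenvalues_2x2(rot)
# The eigenvalues are e^{i*theta} and e^{-i*theta}: complex numbers
# appear even though every entry of the matrix is real.
print(lam1)   # approximately 0.5 + 0.866j
```

Diagonalizing this matrix, and hence its Jordan form over the reals, is impossible without admitting those two complex eigenvalues.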

Also, complex numbers are really necessary for understanding quantum theory in physics. Everything you think you intuit about the universe, in terms of "measurements" being real numbers, starts to fall apart at the quantum level. The universe is far stranger than it seems.

**p-adic Numbers**

p-adic numbers are probably called "numbers" just because their construction is essentially the "same" as the construction of the reals, only using a different metric on Q, and because they can be used to answer questions about the natural numbers.

**Ordinals, Cardinals**

Ordinal and cardinal numbers represent a different type of extension of the natural numbers.

I think of "ordinals" as being like the results of a race with no ties. Every runner has a result "ordinal" and any non-empty subset of the runners has a "winner."

Cardinal numbers are like a pile of beans, and determining whether two piles of beans have the same amount in them.

Ordinals are by far the stranger of the two, because even addition of ordinals is non-commutative: for example, 1 + ω = ω, but ω + 1 ≠ ω.

In this case, then, ordinals and cardinals are "measurements" of something.

**Non-standard real number definitions**

There are also lots of variations of the real numbers that we call "numbers," basically because they are a variant of the real numbers.

**Conclusion: Exclusions**

The hardest part of coming up with a definition for "number" is the exclusions: why don't we call matrices, or functions, or other similar things "numbers"? Things we see as primarily functions are not seen as "numbers," but it is hard to exclude them with anything rigorous. Indeed, one way to see the complex numbers is as a subring of the ring of real 2×2 matrices, and one reason we need complex numbers is that they are great at representing the operation of rotation; that is why we see them come up in studying real matrices.
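That subring point of view is easy to verify in a few lines. The sketch below (function names are mine) embeds a + bi as the real matrix [[a, -b], [b, a]] and checks that matrix multiplication agrees with complex multiplication:

```python
def as_matrix(z):
    """Embed the complex number z = a + bi as the real 2x2 matrix
    [[a, -b], [b, a]]; i itself becomes the 90-degree rotation matrix."""
    a, b = z.real, z.imag
    return [[a, -b], [b, a]]

def mat_mul(m, n):
    """Ordinary 2x2 matrix multiplication."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

z, w = 1 + 2j, 3 - 1j
# Multiplying the matrices is the same as multiplying the complex numbers:
assert mat_mul(as_matrix(z), as_matrix(w)) == as_matrix(z * w)
# And i^2 = -1 becomes: rotating by 90 degrees twice is rotating by 180.
assert mat_mul(as_matrix(1j), as_matrix(1j)) == as_matrix(-1 + 0j)
```

So the complex numbers sit inside the 2×2 real matrices as exactly the scalings-plus-rotations, which is why they keep surfacing in real linear algebra.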

Zero divisors are often a sign that a thing isn't a number, but we do have g-adic numbers for g not prime, and these form a ring with zero divisors. (Usually, g-adic numbers are not actually used anywhere, since they are just products of rings of p-adic numbers.)
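Those zero divisors can even be exhibited to any desired precision. Repeatedly squaring 5 converges 10-adically (each step roughly doubles the number of stable trailing digits) to an idempotent e that is neither 0 nor 1, so e and e − 1 are both nonzero while their product vanishes. A sketch working modulo 10^40:

```python
K = 40             # number of base-10 digits to work with
MOD = 10 ** K

# Squaring 5 repeatedly converges 10-adically to an idempotent:
# the trailing digits of e stabilize as the iteration proceeds.
e = 5
for _ in range(60):
    e = e * e % MOD

assert e * e % MOD == e        # e is idempotent: e^2 = e...
assert e % MOD not in (0, 1)   # ...but e is neither 0 nor 1
# So e and e - 1 are nonzero, yet their product is divisible by 10^K:
assert e * (e - 1) % MOD == 0
print(e)    # its last digits are ...8212890625
```

Nothing like this happens with p prime: squaring can't manufacture an idempotent other than 0 and 1 in the p-adics, which is one concrete way the prime case feels more "number-like."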

Does anybody refer to the elements of the ring Z/nZ as "numbers"? Not in my experience.

I also haven't seen finite field elements referred to as "numbers."

So, no, the entire history of mathematics has not ascribed a single logical meaning to the word "number" that lets us distinguish what is and isn't a number. As noted in the comments, "Cayley numbers" is another name for the octonions, yet Google Ngram shows zero occurrences of the singular phrase "Cayley number." So octonions are numbers, but a single octonion is not a "number"? That's just the world we live in. Number, being the most basic idea in mathematics, gets generalized in a lot of interesting ways, not all consistent, and not the same way over time.

Recall, the ancients didn't define 0 as a "number."

Q: How many beans do you have?

A: I don't have beans.

(I suspect this failure was due to the confusion between cardinals and ordinals: we count finite cardinals by arbitrarily sorting them and then taking the ordinal of the last element, but that fails when counting an empty collection...)

**Attribution**

*Source: Link, Question Author: user164061, Answer Author: Thomas Andrews*