Examples of patterns that eventually fail

Often, when I try to describe mathematics to the layman, I find myself struggling to convince them of the importance and consequence of “proof”. I receive responses like “surely if Collatz is true up to $20 \times 2^{58}$, then it must always be true?” and “the sequence of the number of edges on complete graphs starts $0, 1, 3, 6, 10$, so the next term must be $15$, etc.”

Granted, this second statement is less logically unsound than the first since it’s not difficult to see the reason why the sequence must continue as such; nevertheless, the statement was made on a premise that boils down to “interesting patterns must always continue”.

I try to counter this logic by creating a ridiculous argument like “the numbers $1,2,3,4,5$ are less than $100$, so surely all numbers are”, but this usually fails to be convincing.

So, are there any examples of non-trivial patterns that appear to be true for a large number of small cases, but then fail for some larger case? A good answer to this question should:

  1. be one which could be explained to the layman without having to subject them to a 24-lecture course of background material, and
  2. have as a minimal counterexample a case which cannot (feasibly) be checked without the use of a computer.

I believe conditions 1. and 2. make my question specific enough to have in some sense a “right” (or at least a “not wrong”) answer; but I’d be happy to clarify if this is not the case. I suppose I’m expecting an answer to come from number theory, but can see that areas like graph theory, combinatorics more generally and set theory could potentially offer suitable answers.


I’ll translate an entry about Polya’s conjecture from the blog Gaussianos (“Gaussians”):

We’ll say a number is of even kind if an even number of primes (counted with multiplicity) appears in its prime factorization. For example, $6 = 2\cdot 3$ is a number of even kind. And we’ll say a number is of odd kind if the number of primes in its factorization is odd. For example, $18 = 2\cdot 3\cdot 3$ is of odd kind. ($1$ is considered of even kind.)

Let $n$ be any natural number. We’ll consider the following numbers:

  1. $E(n) =$ number of positive integers less or equal to $n$ that are of even kind.
  2. $O(n) =$ number of positive integers less or equal to $n$ that are of odd kind.

Let’s consider $n = 7$. In this case $O(7) = 4$ (the numbers $2$, $3$, $5$ and $7$) and $E(7) = 3$ (the numbers $1$, $4$ and $6$). So $O(7) > E(7)$.

For $n = 6$: $O(6) = 3$ and $E(6) = 3$. Thus $O(6) = E(6)$.
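The counts above are easy to reproduce by brute force. Here is a minimal sketch (not from the original answer) that classifies numbers by the parity of their number of prime factors, counted with multiplicity, and tallies $E(n)$ and $O(n)$; the function names mirror the notation in the text:

```python
def num_prime_factors(n):
    """Number of prime factors of n, counted with multiplicity.
    E.g. 18 = 2*3*3 has three, so 18 is of odd kind; 1 has zero, so even kind."""
    count = 0
    d = 2
    while d * d <= n:
        while n % d == 0:
            count += 1
            n //= d
        d += 1
    if n > 1:  # leftover prime factor larger than sqrt of the original n
        count += 1
    return count

def E(n):
    """Count of positive integers <= n of even kind (1 included)."""
    return sum(1 for k in range(1, n + 1) if num_prime_factors(k) % 2 == 0)

def O(n):
    """Count of positive integers <= n of odd kind."""
    return sum(1 for k in range(1, n + 1) if num_prime_factors(k) % 2 == 1)

print(E(7), O(7))  # 3 4  -> O(7) > E(7), matching the worked example
print(E(6), O(6))  # 3 3  -> O(6) = E(6)
```

Trial division is far too slow to reach the counterexample near $9 \times 10^8$, which is exactly why condition 2 of the question is satisfied here, but it confirms the small cases instantly.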

In 1919 George Polya proposed the following result, known as Polya’s Conjecture:

For all $n > 2$, $O(n)$ is greater than or equal to $E(n)$.

Polya had checked this for $n < 1500$. In the following years it was verified up to $n = 1000000$, which is why the conjecture was widely believed to be true. But that belief was mistaken.

In 1962, Lehman found an explicit counterexample: for $n = 906180359$, we have $O(n) = E(n) - 1$, so:

$$O(906180359) < E(906180359).$$

The smallest counterexample, $n = 906150257$, was found by exhaustive search by Tanaka in 1980.

Thus Polya’s Conjecture is false.

What do we learn from this? Well, it is simple: unfortunately, in mathematics we cannot trust intuition or what happens for a finite number of cases, no matter how large that number is. Until a result is proved in general, we have no certainty that it is true.

Source: Link, Question Author: Community, Answer Author: Community
