# Numerical phenomenon. Who can explain?

I was doing some software engineering and wanted to have a thread do something in the background to basically just waste CPU time for a certain test.

While I could have done something really boring like `for (i < 10000000) { j = 2 * i }`, I ended up having the program start with $1$, and then for a million steps choose a random real number $r$ in the interval $[0, R]$ (uniformly distributed) and multiply the result by $r$ at each step.

- When $R = 2$, it converged to $0$.
- When $R = 3$, it exploded to infinity.

So of course, the question anyone with a modicum of curiosity would ask: for what $R$ do we have the transition? And then I tried the first number between $2$ and $3$ that we would all think of, Euler's number $e$, and sure enough, this conjecture was right. Would love to see a proof of this.
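For anyone who wants to reproduce it, here is a minimal Python sketch of the experiment (the function name, seed, and step count are my own choices, not from my actual test code):

```python
import random

def random_product(R, steps=1_000_000, seed=0):
    """Start at 1 and repeatedly multiply by a Uniform(0, R) draw."""
    rng = random.Random(seed)
    y = 1.0
    for _ in range(steps):
        y *= rng.uniform(0.0, R)
    return y

# In double precision, R = 2 collapses to 0.0 (underflow) and
# R = 3 blows up to inf (overflow), matching the observation above.
print(random_product(2.0))
print(random_product(3.0))
```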

Now when I should be working, I’m instead wondering about the behavior of this script.

Ironically, rather than wasting my CPU's time, I'm wasting my own time. But it's a beautiful phenomenon. I don't regret it. $\ddot\smile$

EDIT: I saw that you solved it yourself. Congrats! I’m posting this anyway because I was most of the way through typing it when your answer hit.

Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.

Let $X_i \sim \operatorname{Uniform}(0, r)$, and let $Y_n = \prod_{i=1}^{n} X_i$. Note that $\log(Y_n) = \sum_{i=1}^n \log(X_i)$. The eventual emergence of $e$ as important is already somewhat clear, even though we haven't really done anything yet.

The more useful formulation here is that $\frac{\log(Y_n)}{n} = \frac{1}{n} \sum \log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $\mathbb{E}[\log(X_i)]$. We have
$$\mathbb{E} \log(X_i) = \int_0^r \log(x) \cdot \frac{1}{r} \, \mathrm{d}x = \frac{1}{r} \big[ x \log(x) - x \big] \bigg|_0^r = \log(r) - 1.$$

If $r < e$, then $\log(Y_n)/n \to c < 0$, which implies that $\log(Y_n) \to -\infty$, hence $Y_n \to 0$. Similarly, if $r > e$, then $\log(Y_n)/n \to c > 0$, whence $Y_n \to \infty$. The fun case is: what happens when $r = e$?
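This dichotomy is easy to see numerically (my own sketch, not part of the proof): tracking $\log(Y_n)/n$ as a running sum of logs avoids the floating-point under/overflow of the raw product, and its sign flips as $r$ crosses $e$:

```python
import math
import random

def log_mean(r, n=1_000_000, seed=0):
    """Compute log(Y_n) / n as a running sum of logs (no under/overflow)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += math.log(rng.uniform(0.0, r))
    return total / n

# The sample value of log(Y_n)/n sits near log(r) - 1, negative for
# r < e and positive for r > e:
for r in (math.e - 0.1, math.e, math.e + 0.1):
    print(f"r = {r:.4f}: log(Y_n)/n = {log_mean(r):+.5f}, "
          f"log(r) - 1 = {math.log(r) - 1:+.5f}")
```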