Consider the sequence defined as

x_1 = 1

x_{n+1} = \sin x_n

I think I was able to show that the sequence \sqrt{n} x_{n} converges to \sqrt{3} by a tedious elementary method which I wasn’t too happy about.

(I think I did this by showing that \sqrt{\frac{3}{n+1}} < x_{n} < \sqrt{\frac{3}{n}}, don't remember exactly)

This looks like it should be a standard problem.

Does anyone know a simple (and preferably elementary) proof for the fact that the sequence \sqrt{n}x_{n} converges to \sqrt{3}?
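Not a proof, but a quick numerical sanity check of the claim is easy to run; this sketch assumes nothing beyond the definitions above:

```python
import math

# Iterate x_{n+1} = sin(x_n) starting from x_1 = 1,
# then look at sqrt(n) * x_n, which should be near sqrt(3).
x = 1.0
n = 1
for _ in range(200000 - 1):
    x = math.sin(x)
    n += 1

print(math.sqrt(n) * x)  # close to sqrt(3) = 1.7320508...
```

The convergence is slow (the error decays roughly like $\log n / n$), so a fairly large number of iterations is needed to see several correct digits.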

**Answer**

Before getting into the details, let me say: The ideas I'm talking about, including this exact example, can be found in chapter 8 of *Asymptotic Methods in Analysis* (second edition), by N. G. de Bruijn. This is a really superb book, and I recommend it to anyone who wants to learn how to approximate quantities in "calculus-like" settings. (If you want to do approximation in combinatorial settings, I recommend Chapter 9 of *Concrete Mathematics*.)

Also, this isn't just about \sin. Let f be a function with f(0)=0 and 0 \leq f(u) < u for u \in (0,c]; then the sequence of iterates x_n:=f(f(\cdots f(c)\cdots)) (with n applications of f) approaches 0. If f(u)=u-a u^{k+1} + O(u^{k+2}) (with a>0), then x_n \approx \alpha n^{-1/k} for a suitable constant \alpha, and you can prove that by the same methods used here.
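To illustrate the general claim, here is a check with a map of my own choosing (not from the answer): f(u) = u - u^2, so k = 1 and a = 1, and the same computation with y_n = 1/x_n predicts x_n \approx 1/n, i.e. n\,x_n \to 1.

```python
# Illustrative example (my choice): f(u) = u - u^2 on (0, 1/2],
# which has the form u - a*u^(k+1) + ... with a = 1, k = 1.
# The heuristic then predicts x_n ~ 1/n, i.e. n * x_n -> 1.
def f(u):
    return u - u * u

x = 0.5
n = 1
for _ in range(100000 - 1):
    x = f(x)
    n += 1

print(n * x)  # close to 1
```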

Having said that, here is the answer to your question. On [0,1], we have

\sin x=x-x^3/6+O(x^5).

Setting y_n=1/x_n^2, we have

1/x_{n+1}^2 = \frac{1}{\sin^2 x_n} = x_n^{-2} \left(1-x_n^2/6+O(x_n^4) \right)^{-2} = x_n^{-2} \left(1+x_n^2/3+O(x_n^4) \right) = 1/x_n^2 + 1/3 + O(x_n^2)

so

y_{n+1} = y_n + 1/3 + O(y_n^{-1}).
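One can also watch this recurrence directly: along the orbit, the increments y_{n+1} - y_n settle down to 1/3. A minimal sketch:

```python
import math

# Track y_n = 1 / x_n^2 along the orbit x_{n+1} = sin(x_n), x_1 = 1,
# and inspect the increment y_{n+1} - y_n, which the expansion above
# says equals 1/3 + O(y_n^{-1}).
x = 1.0
y = 1.0 / x**2
for _ in range(10000):
    x = math.sin(x)
    y_next = 1.0 / x**2
    increment = y_next - y
    y = y_next

print(increment)  # close to 1/3
```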

Summing this recurrence (the constant y_1 is absorbed into the error term, since the sum is bounded below), we see that

y_n = \frac{n}{3} + O\left( \sum_{k=1}^n y_k^{-1} \right)

and

\frac{1}{n}y_n = \frac{1}{3} + \frac{1}{n} O\left( \sum_{k=1}^n y_k^{-1} \right)

Since we already know that x_n \to 0, we know that y_n^{-1} \to 0, so the average \frac{1}{n} \sum_{k=1}^n y_k^{-1} goes to zero and we get \lim_{n \to \infty} y_n/n=1/3. Transforming back, \sqrt{n}\,x_n = 1/\sqrt{y_n/n} \to \sqrt{3} by the continuity of 1/\sqrt{t} at t=1/3.

**Attribution**

*Source: Link, Question Author: Aryabhata, Answer Author: David E Speyer*