$\sqrt{c+\sqrt{c+\sqrt{c+\cdots}}}$, or the limit of the sequence $x_{n+1} = \sqrt{c+x_n}$

(Fitzpatrick Advanced Calculus 2e, Sec. 2.4 #12)

For $c > 0$, consider the quadratic equation
$$x^2 - x - c = 0, \qquad x > 0.$$

Define the sequence $\{x_n\}$ recursively by fixing $|x_1| < c$ and then, if $n$ is an index for which $x_n$ has been defined, defining

$$x_{n+1} = \sqrt{c + x_n}.$$

Prove that the sequence $\{x_n\}$ converges monotonically to the solution of the above equation.

Note: The answers below might assume $x_1 > 0$, but they still apply, since in any case $x_3 > 0$.



Answer

Assuming that you know that a monotone, bounded sequence converges, you want to do two things. First, show that $\langle x_n : n \in \mathbb{Z}^+ \rangle$ is monotone and bounded, and then show that its limit is the positive root of $x^2 - x - c = 0$.

If $c = x_1 = 1$, then $x_2 = \sqrt2 > x_1$, while if $c = 1$ and $x_1 = 2$, then $x_2 = \sqrt3 < x_1$; so if the sequence is monotonic, the direction in which it's monotonic must depend on $c$ and $x_1$. A good first step would be to try to figure out how this dependence works.
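A quick numerical experiment makes the dependence visible. Below is a minimal Python sketch (not part of the original answer; the function name `iterate_sqrt` and the step count are illustrative choices) that runs the recursion from the two starting values above:

```python
import math

def iterate_sqrt(c, x1, n_steps=8):
    """Iterate x_{n+1} = sqrt(c + x_n), returning the whole trajectory."""
    xs = [x1]
    for _ in range(n_steps):
        xs.append(math.sqrt(c + xs[-1]))
    return xs

# c = 1: the positive root is r = (1 + sqrt(5))/2 ~ 1.618 (the golden ratio).
print(iterate_sqrt(1, 1))  # increasing: 1, 1.414..., 1.553..., ... -> r
print(iterate_sqrt(1, 2))  # decreasing: 2, 1.732..., 1.652..., ... -> r
```

Both trajectories settle near $1.618\ldots$, the first approaching from below and the second from above.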

The positive root of the quadratic is $\frac12\left(1 + \sqrt{1 + 4c}\right)$, which I'll denote by $r$. If $x_n \to r$, as claimed, and does so monotonically, it must be the case that the sequence increases monotonically if $x_1 < r$ and decreases monotonically if $x_1 > r$. In the examples in the last paragraph, $r = \frac12\left(1 + \sqrt5\right) \approx 1.618$, so they behave as predicted.
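For reference, $r$ is just the quadratic formula applied to $x^2 - x - c = 0$; spelling out the routine step:
$$x = \frac{1 \pm \sqrt{1 + 4c}}{2}\,,$$
and the root $\frac12\left(1 - \sqrt{1 + 4c}\right)$ is discarded because it is negative for $c > 0$, while the problem requires $x > 0$; hence $r = \frac12\left(1 + \sqrt{1 + 4c}\right)$.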

This suggests that your first step should be to show that if $x_n < r$, then $x_n < x_{n+1} < r$, while if $x_n > r$, then $x_n > x_{n+1} > r$; that would be enough to show that $\langle x_n : n \in \mathbb{Z}^+ \rangle$ is both monotone and bounded and hence that it has a limit.

Suppose that $0 \le x_n < r$; you can easily check that $x_n^2 - x_n - c < 0$, i.e., that $x_n^2 < x_n + c$. On the other hand, $x_{n+1}^2 = c + x_n$, so $x_{n+1}^2 > x_n^2$, and therefore $x_{n+1} > x_n$. Is it possible that $x_{n+1} \ge r$? That would require that $x_{n+1}^2 - x_{n+1} - c \ge 0$ (why?) and hence that $x_{n+1}^2 \ge x_{n+1} + c > x_n + c = x_{n+1}^2\,,$ which is clearly impossible. Thus, if $0 \le x_n < r$, we must have $x_n < x_{n+1} < r$, as desired. I leave the case $x_n > r$ to you.
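As a hint for both sign claims above (the "easily check" and the "(why?)"), a single factorization covers them; this is a standard observation added here, not part of the original answer:
$$x^2 - x - c = (x - r)(x - r')\,, \qquad r' = \frac{1 - \sqrt{1 + 4c}}{2} < 0\,.$$
For $x \ge 0$ the factor $x - r'$ is strictly positive, so $x^2 - x - c$ has the same sign as $x - r$: negative exactly when $0 \le x < r$, and nonnegative exactly when $x \ge r$.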

Once this is done, you still have to show that the limit of the sequence really is $r$. Let $f(x) = \sqrt{c + x}$; clearly $f$ is continuous, so if the sequence converges to $L$, we have $$L = \lim_{n\to\infty} x_n = \lim_{n\to\infty} x_{n+1} = \lim_{n\to\infty} f(x_n) = f(L)\,,$$ and from there it's trivial to check that $L = r$.
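Spelled out, that last check is one line of algebra, included here for completeness:
$$L = \sqrt{c + L} \;\Longrightarrow\; L \ge 0 \;\text{ and }\; L^2 = c + L \;\Longrightarrow\; L^2 - L - c = 0\,,$$
and since $r$ is the only nonnegative root of this quadratic, $L = r$.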

Added: Note that although the problem gave us $x_1 > 0$, this isn't actually necessary: all that's needed is that $x_1 \ge -c$, so that $x_2$ is defined, since $x_2 = \sqrt{c + x_1} \ge 0$ automatically.

Attribution
Source: Link, Question Author: cnuulhu, Answer Author: Brian M. Scott
