I’ve been trying to solve the following problem:

Suppose that $f$ and $f'$ are continuous functions on $\mathbb{R}$, and that $\displaystyle\lim_{x\to\infty}f(x)$ and $\displaystyle\lim_{x\to\infty}f'(x)$ both exist. Show that $\displaystyle\lim_{x\to\infty}f'(x) = 0$.

I’m not entirely sure what to do. Since there’s not a lot of information given, I guess there isn’t very much one can do. I tried using the definition of the derivative and showing that it went to $0$ as $x$ went to $\infty$, but that didn’t really work out. Now I’m thinking I should assume $\displaystyle\lim_{x\to\infty}f'(x) = L \neq 0$ and try to get a contradiction, but I’m not sure where the contradiction would come from.

Could somebody point me in the right direction (e.g., a certain theorem or property I have to use)? Thanks.

**Answer**

Apply a slick L’Hôpital trick: if $f + f' \to L$ as $x\to\infty$, then $f\to L$ and $f'\to 0$, since

$$\lim_{x\to\infty} f(x) \;=\; \lim_{x\to\infty}\frac{e^x\, f(x)}{e^x} \;=\; \lim_{x\to\infty}\frac{\bigl(e^x\, f(x)\bigr)'}{\bigl(e^x\bigr)'} \;=\; \lim_{x\to\infty}\frac{e^x\,\bigl(f(x)+f'(x)\bigr)}{e^x} \;=\; \lim_{x\to\infty}\bigl(f(x)+f'(x)\bigr) \;=\; L.$$

Here L’Hôpital’s rule applies in its $\infty/\infty$ form (only the denominator need tend to $\infty$, and $e^x\to\infty$), which justifies the second equality. Then $f' = (f + f') - f \to L - L = 0$.

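To connect the trick back to the exercise as you stated it, here is a short sketch (my notation: $A$ and $B$ denote the two limits you are given to exist):

$$A \;=\; \lim_{x\to\infty} f(x) \;=\; \lim_{x\to\infty}\bigl(f(x)+f'(x)\bigr) \;=\; A + B \qquad\Longrightarrow\qquad B \;=\; \lim_{x\to\infty} f'(x) \;=\; 0.$$

That is, since both limits exist, $f+f'$ tends to $A+B$; the trick says $f$ tends to that same value, and uniqueness of limits forces $B = 0$.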
This application of L’Hôpital’s rule achieved some notoriety because the problem appeared in Hardy’s classic textbook *A Course of Pure Mathematics*, but with a less elegant solution. See, for example, Landau and Jones, “A Hardy Old Problem,” Math. Magazine 56 (1983), 230–232.

**Attribution**
*Source: Link, Question Author: saurs, Answer Author: Bill Dubuque*