
I noticed that if $f'(x)$ tends to $+\infty$ as $x$ tends to $+\infty$, then $f(x)$ must tend to $+\infty$ as $x$ tends to $+\infty$ as well. I'm stuck on the proof, though.

If you apply the mean value theorem to $f$ on $(x,x+1)$, you get:

$f(x+1) - f(x) = f'(b)$ for some $b \in (x, x+1)$.

Now, letting $x \to +\infty$, $b$ must tend to $+\infty$ as well, so $\lim_{x\to+\infty}\bigl(f(x+1) - f(x)\bigr) = +\infty$.
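For instance (just an illustrative example, with $f(x)=x^2$ chosen to check this step concretely), the mean value theorem on $(x,x+1)$ gives
$$f(x+1)-f(x)=(x+1)^2-x^2=2x+1=f'\!\left(x+\tfrac12\right),$$
so here $b=x+\tfrac12\in(x,x+1)$, and indeed $b\to+\infty$ and $f(x+1)-f(x)\to+\infty$ as $x\to+\infty$.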

Also, $f(x)$ must be eventually increasing, since $f'(x)>0$ for all large enough $x$. Can somebody continue this, or show me another way of proving it?

Plom

1 Answer


Since $f'(x)\to+\infty$ when $x\to+\infty$, there exists $x_0$ such that $f'(x)\geqslant1$ for every $x\geqslant x_0$. Hence, by the mean value theorem, for every $x>x_0$ there is some $c\in(x_0,x)$ with $f(x)-f(x_0)=f'(c)(x-x_0)\geqslant x-x_0$, that is, $f(x)\geqslant x-x_0+f(x_0)$ for every $x\geqslant x_0$. Since $x-x_0+f(x_0)\to+\infty$ when $x\to+\infty$, this implies that $f(x)\to+\infty$ when $x\to+\infty$.
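For a concrete illustration (with $f(x)=x\ln x$ chosen only as an example): here $f'(x)=\ln x+1\to+\infty$, and $f'(x)\geqslant1$ for every $x\geqslant1$, so one can take $x_0=1$, and the bound gives
$$f(x)\geqslant x-1+f(1)=x-1\to+\infty\quad\text{as }x\to+\infty.$$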

Did