Another exercise from Bartle and Sherbert's *Introduction to Real Analysis* (this one is Exercise 9.4.14):
Use the Lagrange form of the remainder to justify the general Binomial Expansion $$(1+x)^{m}=\sum_{n=0}^{\infty}\binom{m}{n}x^{n}\quad \mathrm{for}\ 0\le x<1$$
Note: $m$ is an arbitrary real number.
My take: To prove the statement, one should show that the Taylor series coefficients around $x=0$ are indeed $\frac{f^{(n)}(0)}{n!}=\binom{m}{n}$, which is straightforward, and then that the Lagrange form of the remainder, $$R_{n}(x)=\binom{m}{n+1}(1+c)^{m-(n+1)}x^{n+1},$$ tends to $0$ as $n\rightarrow\infty$ (of course, it suffices to show this for $0\le x<1$ and $0\le c<1$, since $c$ lies between $0$ and $x$).
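Spelling out the coefficient step that I'm calling straightforward, with $f(x)=(1+x)^{m}$:

$$f^{(n)}(x)=m(m-1)\cdots(m-n+1)(1+x)^{m-n},\qquad \frac{f^{(n)}(0)}{n!}=\frac{m(m-1)\cdots(m-n+1)}{n!}=\binom{m}{n}.$$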
I have trouble with this latter part. Here $\frac{x}{1+c}<1$, so $\left(\frac{x}{1+c}\right)^{n+1}\rightarrow 0$ as $n\rightarrow\infty$. Also, the factor $(1+c)^{m}$ is bounded, so it won't affect a limit that is already zero. But I'm not sure what to do about the $\binom{m}{n+1}$ part...
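As a sanity check (not a proof), here is a small numerical experiment with arbitrary sample values $m=1.5$, $x=0.9$ (my own choice, not from the book): it computes the generalized binomial coefficient directly from its product definition, prints the worst-case remainder bound $|\binom{m}{n+1}|x^{n+1}$ for growing $n$, and compares a partial sum of the series against the closed form $(1+x)^{m}$.

```python
# Numerical sanity check of the binomial series and its remainder bound.
# m = 1.5 and x = 0.9 are arbitrary sample values, not from the exercise.

def gen_binom(m, n):
    """Generalized binomial coefficient C(m, n) = m(m-1)...(m-n+1)/n! for real m."""
    c = 1.0
    for k in range(n):
        c *= (m - k) / (k + 1)
    return c

m, x = 1.5, 0.9

# For large n the exponent m - (n+1) is negative, so (1+c)^(m-(n+1)) is
# largest at c = 0, giving the bound |R_n| <= |C(m, n+1)| * x^(n+1).
for n in (5, 20, 80):
    print(n, abs(gen_binom(m, n + 1)) * x ** (n + 1))

# Partial sum of the binomial series vs. the closed form (1 + x)^m.
approx = sum(gen_binom(m, n) * x ** n for n in range(200))
print(approx, (1 + x) ** m)
```

The printed bounds shrink rapidly, which is consistent with $\binom{m}{n+1}x^{n+1}\rightarrow 0$ even when the coefficient alone does not obviously tend to zero.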