
I am conducting an error analysis of some sensor measurements with associated uncertainties. I ended up with the quantity $\mathcal{S}$ defined below. I simulated it numerically and noticed that it is always bounded by $\pm 1$. Nonetheless, I need to prove this mathematically. So my question is: why does the following inequality hold for every $\theta$ and every $\epsilon \neq 0$?

$$ \left| \mathcal{S} \right| = \left| \frac{\sin(\theta+\epsilon)-\sin(\theta)}{\epsilon} \right| \leq 1. $$
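For reference, a minimal numerical check along the lines of the simulation mentioned above (a sketch, assuming NumPy; the sample sizes and ranges are arbitrary choices) looks like this:

```python
import numpy as np

# Sample theta and epsilon over wide ranges (epsilon must be nonzero).
rng = np.random.default_rng(0)
theta = rng.uniform(-10.0, 10.0, 100_000)
eps = rng.uniform(-5.0, 5.0, 100_000)
eps[eps == 0] = 1e-9  # guard against division by zero

# The difference quotient S from the question.
S = (np.sin(theta + eps) - np.sin(theta)) / eps

# Every sampled value of |S| stays within [0, 1].
print(np.max(np.abs(S)))
```

Every run stays at or below 1 (up to floating-point rounding), matching the observed bound.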

Thanks.

AEW

1 Answer


Since the sine function is differentiable everywhere (indeed of class $C^\infty$), the mean value theorem (Lagrange's theorem) gives a $\lambda$ strictly between $\theta$ and $\theta + \epsilon$ such that

$$ \sin(\theta + \epsilon) - \sin(\theta) = \epsilon \cos(\lambda), $$

since the derivative of $\sin$ is $\cos$. Dividing by $\epsilon$ and using $|\cos(\lambda)| \leq 1$, you obtain the result.

Isaac