I am conducting an error analysis of sensor measurements with associated uncertainties. I ended up with the quantity $\mathcal{S}$ defined below. I simulated it numerically and noticed that it is always bounded by $\pm 1$, but I need to prove this mathematically. So my question is: why does the following inequality hold for any $\theta$ and any $\epsilon \neq 0$?
$$ \mathcal{S} = \frac{\sin(\theta+\epsilon)-\sin(\theta)}{\epsilon}, \qquad \left| \mathcal{S} \right| \leq 1. $$
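For reference, here is a minimal sketch of the kind of numerical check I ran (the sampling ranges are just illustrative):

```python
import numpy as np

# Sample theta over a full period and epsilon over several orders of
# magnitude, on both sides of zero, avoiding epsilon = 0 where the
# expression is undefined.
theta = np.linspace(0.0, 2.0 * np.pi, 1000)
eps = np.concatenate([-np.logspace(-6, 2, 500), np.logspace(-6, 2, 500)])

T, E = np.meshgrid(theta, eps)
S = (np.sin(T + E) - np.sin(T)) / E

# The largest magnitude observed never exceeds 1 in my runs.
print("max |S| =", np.abs(S).max())
```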
Thanks.