Say we want to prove that
$$ \left(1 + \frac{x}{n} + o(1/n) \right)^{n} \xrightarrow{n \to \infty} e^x \,\,\text{ for fixed }x \geq 0.$$
The case
$$ \left(1 + \frac{x}{n} \right)^{n} \to e^x $$
can be handled by the monotone convergence theorem (for series, i.e. over the counting measure), because the columns of the array below are non-decreasing in $n$ when $x \geq 0$. But once we insert the $o(1/n)$ term, we no longer know that.
$$\begin{matrix} 1 & x & 0 && \dotsb \\ 1 & x & \frac{2 \cdot 1}{2^2}\frac{x^2}{2!} & 0 & \dotsb \\ 1 & x & \frac{3 \cdot 2}{3^2}\frac{x^2}{2!} & \frac{3 \cdot 2 \cdot 1}{3^3}\frac{x^3}{3!} & \dotsb \\ 1 & x & \frac{4 \cdot 3}{4^2}\frac{x^2}{2!} & \frac{4 \cdot 3 \cdot 2}{4^3}\frac{x^3}{3!} & \dotsb \end{matrix}$$
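As a quick numerical sanity check (the function name `term` is my own), the column entries $\binom{n}{k}(x/n)^k = \frac{n(n-1)\cdots(n-k+1)}{n^k}\frac{x^k}{k!}$ can be verified to be non-decreasing in $n$ for fixed $k$ and $x \geq 0$:

```python
from math import comb

def term(n, k, x):
    # k-th binomial term of (1 + x/n)^n:
    #   C(n, k) * (x/n)^k = [n(n-1)...(n-k+1) / n^k] * x^k / k!
    return comb(n, k) * (x / n) ** k

x = 1.7
for k in range(6):
    # column k of the array, read downward from row n = max(k, 1)
    col = [term(n, k, x) for n in range(max(k, 1), 50)]
    assert all(a <= b + 1e-12 for a, b in zip(col, col[1:])), k
print("columns non-decreasing in n")
```

This only probes finitely many $n$, of course; the actual monotonicity is the standard fact that $\frac{n(n-1)\cdots(n-k+1)}{n^k}$ increases to $1$ as $n \to \infty$.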
When we include the $o(1/n)$, we don't know whether it is monotone in $n$. Would the usual way be: first note that $n \cdot o(1/n) \to 0$, so it is bounded by some constant $C$ for all large $n$; then the terms are dominated by those of the (convergent) series for $e^{x + C}$, and the dominated convergence theorem gives the limit?
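For whatever it is worth, one can check the claimed limit numerically for a concrete choice of remainder; here I take $o(1/n) = n^{-3/2}$ purely as an example term (my own choice, not from the problem):

```python
import math

def f(n, x):
    # (1 + x/n + r(n))^n with the sample o(1/n) remainder r(n) = n^(-3/2)
    return (1 + x / n + n ** -1.5) ** n

x = 2.0
for k in range(1, 7):
    n = 10 ** k
    print(n, f(n, x), abs(f(n, x) - math.exp(x)))
# the printed gap to e^x = e^2 shrinks as n grows
```

This is only a plausibility check for one remainder, not a proof, but it is reassuring that the perturbation does not move the limit.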
Or is there a smoother, more generalizable path, perhaps some clever use of a Taylor expansion?