From what I understand, for Newton's Method, viewed as the fixed-point iteration
$$x_n = g(x_{n-1}) = x_{n-1} - \frac{f(x_{n-1})}{f'(x_{n-1})},$$
to converge:
(1) $g(x)$ and $g'(x)$ must be continuous around the root.
(2) The initial guess must fall within this interval.
(3) The condition $|g'(x)| < 1$ must hold for all $x$ in the interval around the root (strict inequality; $|g'(x)| \leq 1$ alone is not sufficient).
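The conditions above can be checked numerically for a concrete case. Here is a minimal sketch (an illustration, not a proof) using $f(x) = x^2 - 2$, whose positive root is $\sqrt{2}$; the names `f`, `fp`, `g`, and `gp` are my own labels, not anything standard:

```python
def f(x):
    return x * x - 2.0

def fp(x):
    # f'(x)
    return 2.0 * x

def g(x):
    # Newton iteration map: g(x) = x - f(x)/f'(x)
    return x - f(x) / fp(x)

def gp(x, h=1e-6):
    # numerical estimate of g'(x) via a central difference
    return (g(x + h) - g(x - h)) / (2.0 * h)

# Condition (3): |g'(x)| < 1 on an interval around the root.
interval = [1.0 + 0.1 * i for i in range(11)]  # samples of [1, 2]
print(max(abs(gp(t)) for t in interval))       # stays well below 1 here

# Conditions (1)-(2): start inside that interval and iterate x_n = g(x_{n-1}).
x = 1.5
for _ in range(6):
    x = g(x)
print(x)  # approaches sqrt(2) = 1.41421356...
```

For this particular $f$, $g'(x) = f(x)f''(x)/f'(x)^2$ vanishes at the root, which is why the iterates converge so quickly once the guess is inside the interval.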
This answer, Showing that Newton's method converges, confirms my idea. I also found When is Newton's Method guaranteed to converge to a good solution (non-linear system)?, which gives a good explanation of Newton's Method, but it isn't quite what I was looking for.
I just need to know: do the above-mentioned conditions guarantee convergence?