For all numerical methods I am aware of, you need both consistency and stability to get convergence (for linear problems this is exactly the content of the Lax equivalence theorem).
Consistency is what usually motivates the method - something that makes intuitive sense and comes with a controllable error.
As an example, consider the Forward Euler scheme for integrating ODEs:
$$x'(t) = f\big(x(t), t\big) \Rightarrow \frac{x_{n+1} - x_n}{\Delta t} = f(x_n, t_n)$$
Looks reasonable, right? You approximate the derivative with a first-order Taylor expansion and evaluate the RHS at the known value $x_n$. The error of this approximation is $O(\Delta t)$, so for a "small enough" $\Delta t$ you are good to go, right?
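Here is a minimal sketch of the scheme in Python for a scalar ODE (the function `forward_euler` and its signature are my own illustration, not a library routine):

```python
import numpy as np

def forward_euler(f, x0, t0, t_end, dt):
    """Integrate x'(t) = f(x, t) from t0 to t_end with fixed step dt."""
    n_steps = int(round((t_end - t0) / dt))
    ts = t0 + dt * np.arange(n_steps + 1)
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    for n in range(n_steps):
        # One step of the scheme above: x_{n+1} = x_n + dt * f(x_n, t_n)
        xs[n + 1] = xs[n] + dt * f(xs[n], ts[n])
    return ts, xs
```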
Well, unfortunately it is not that easy. The Forward Euler method is the most prominent example of a potentially unstable scheme. Intuitively speaking, stability is the property of a scheme to stay near the right solution even though it makes an inevitable small error in every step. Stable schemes ensure that the tiny errors you make in every step do not (over millions of steps) pile up into something that diverges / blows up.
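You can watch this happen with the sketch above. The stiff test problem $x'(t) = -50\,x(t)$ is a standard illustration (the specific step sizes here are my choice):

```python
def f(x, t):
    return -50.0 * x  # exact solution x(t) = exp(-50 t), decays to zero

for dt in (0.01, 0.05):
    _, xs = forward_euler(f, x0=1.0, t0=0.0, t_end=1.0, dt=dt)
    print(f"dt = {dt}: x(1) ≈ {xs[-1]:.3e}")

# dt = 0.01: x(1) ≈ 7.889e-31  (decays, stable)
# dt = 0.05: x(1) ≈ 3.325e+03  (blows up, unstable)
```

Note that halving the step size does not just shrink the error a bit - it flips the scheme from divergent to convergent. The threshold between the two regimes is explained below.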
In general, stability is the hard thing to prove (compared to consistency) and often requires sophisticated techniques from various branches of math.
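The simplest instance of such a technique is linear stability analysis: apply the scheme to the test equation $x'(t) = \lambda x(t)$ with $\lambda < 0$ and track the amplification factor of a single step:

$$x_{n+1} = x_n + \Delta t\,\lambda x_n = (1 + \lambda\Delta t)\,x_n \quad\Longrightarrow\quad x_n = (1 + \lambda\Delta t)^n x_0.$$

The numerical solution stays bounded if and only if $|1 + \lambda\Delta t| \le 1$, i.e. $\Delta t \le -2/\lambda$. For $\lambda = -50$ this gives $\Delta t \le 0.04$, exactly the threshold the experiment above runs into.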