The bisection method is commonly said to be linearly convergent, but as far as I can tell, it does not neatly fit the definition. For example, a method converges with order $\alpha$ if
$|x_{n+1}-x^*|\le\lambda|x_n-x^*|^\alpha, \hspace{0.1in} \forall n > N$
Alas, I believe bisection has the pesky tendency that the error can increase at arbitrarily late steps. I'm not sure how to write it succinctly, but the approximation keeps stepping towards the solution in steps of $(b_n - a_n)/2$, where $[a_n,b_n]$ is the current interval. There will be stretches where the error decreases at every step, until the iterate overshoots the solution and the error increases at that step; it then decreases again until the next overshoot. Of course, each overshoot is smaller than the last, but there will still be arbitrarily large steps $n$ at which the error actually increases. (A small numerical sketch of this is below.)
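To make that claim concrete, here is a minimal sketch (not part of the original question; the function $f(x)=x^2-2$ on $[1,2]$, with root $x^*=\sqrt{2}$, is just a hypothetical example) that runs bisection, tracks the midpoint error $|c_n - x^*|$, and flags the steps where it goes up:

```python
import math

def bisect_errors(f, a, b, root, steps=20):
    """Run bisection on [a, b] and return the midpoint errors |c_n - root|."""
    errors = []
    for _ in range(steps):
        c = (a + b) / 2              # midpoint = current approximation
        errors.append(abs(c - root))
        if f(a) * f(c) <= 0:         # sign change on [a, c]: root is there
            b = c
        else:                        # otherwise the root is in [c, b]
            a = c
    return errors

errs = bisect_errors(lambda x: x * x - 2, 1.0, 2.0, math.sqrt(2))
prev = None
for n, e in enumerate(errs, start=1):
    flag = "  <-- error increased" if prev is not None and e > prev else ""
    print(f"n={n:2d}  error={e:.3e}{flag}")
    prev = e
```

For this particular example the error already increases from step 1 to step 2 (roughly $0.086$ to $0.164$) and again at later steps, even though the bracket width $b_n - a_n$ halves every iteration, so the error sequence itself is not monotone.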
Is this incorrect? If so, can you provide a proof? If it is correct, then what can we say about the convergence? I find it unfortunate, because bisection certainly seems like the simplest way to show clear convergence that, at least at face value, looks linear, yet it does not quite fit the definition.
**Edit:**
I am content with Ian's answer, that bisection converges linearly if we use a different definition of order of convergence. However, my question now is: is there an example of a method that satisfies the definition
$\lim_{n\rightarrow \infty} \frac{|x_{n+1}-x^*|}{|x_{n}-x^*|^\alpha} > 0\,?$