
The bisection method is commonly said to be linearly convergent, but as far as I can tell, it does not fit neatly into the definition. E.g., a method is convergent with order $\alpha$ if

$$|x_{n+1}-x^*| \le \lambda\,|x_n-x^*|^\alpha \quad \forall\, n > N$$

Alas, I believe bisection has the pesky tendency that the error can increase at arbitrarily late steps. I'm not sure how to write it succinctly, but the approximation keeps stepping toward the solution in steps of $(b_n - a_n)/2$, where $[a_n, b_n]$ is the current interval. There are stretches where the error decreases at every step, until the iterate overshoots the solution and the error jumps up at that step; it then decreases until it overshoots again. Of course, each overshoot is smaller than the last, but there will still be arbitrarily large steps $n$ at which the error actually increases.
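
To illustrate, here is a minimal Python sketch of my own (`bisect_errors` is just an illustrative name, not from any library): solving $x^2-2=0$ on $[1,2]$, the error already increases between the first two midpoints, from about $0.086$ to about $0.164$, and such increases keep occurring at later steps.

```python
import math

def bisect_errors(f, a, b, root, n_steps=30):
    """Run bisection, recording |midpoint - root| after each halving."""
    errors = []
    for _ in range(n_steps):
        m = (a + b) / 2
        errors.append(abs(m - root))
        if f(a) * f(m) <= 0:   # sign change in [a, m]: root is there
            b = m
        else:                  # otherwise the root is in [m, b]
            a = m
    return errors

errs = bisect_errors(lambda x: x * x - 2, 1.0, 2.0, math.sqrt(2))
# Steps where the error goes *up*: this list is nonempty (step 0 is the
# first such step here) and keeps gaining entries as n_steps grows.
print([n for n in range(len(errs) - 1) if errs[n + 1] > errs[n]])
```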

Is this incorrect? If so, can you provide a proof? If it is correct, then what can we say about the convergence? I find it unfortunate, because bisection certainly seems like the simplest way to exhibit clear convergence that, at least at face value, looks linear, and yet it seems not to quite fit the definition.

**Edit**

I am content with Ian's answer: bisection converges linearly if we consider a different definition of the order of convergence. However, my question now is: is there an example of a method that satisfies the definition

$$\lim_{n\to\infty} \frac{|x_{n+1}-x^*|}{|x_n-x^*|^\alpha} > 0\,?$$

Fractal20
  • see here https://math.stackexchange.com/questions/248616/convergence-of-bisection-method – Dr. Sonnhard Graubner Sep 28 '17 at 16:56
  • @Dr.SonnhardGraubner I have looked at that. I believe it is not sufficient because, again, there will be periodic increases in the error from one step to the next indefinitely, so the limit definition does not work, at least not as stated there. – Fractal20 Sep 28 '17 at 17:16

1 Answer


The more practical version of the definition of linear convergence is that the error $e_n = x_n - x^*$ satisfies $|e_n| \leq \epsilon_n$, where $\epsilon_n$ is a sequence satisfying

$$\lim_{n \to \infty} \frac{\epsilon_{n+1}}{\epsilon_n} = \mu<1.$$

The "simple" version is easier to think about, but asserts some monotonicity of convergence that you just usually don't have. This extension is discussed at https://en.wikipedia.org/wiki/Rate_of_convergence#Extended_definition

In bisection this $\epsilon_n$ can be quite explicitly furnished as $(b-a)2^{-n}$. The cases when $|e_n|$ is a lot less than your $\epsilon_n$ should be thought of as being more like "happy coincidences". For example, if you use bisection to solve $x^2-2=0$ with the initial bracket $[1,2]$, you have one such "happy coincidence" at the first step, when the error of about $0.086$ is compared with the bound of $0.5$.
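
To make this concrete, here is a minimal Python sketch (mine, purely illustrative) tabulating $|e_n|$ against $\epsilon_n = (b-a)2^{-n}$ for that example. The bound ratio $\epsilon_{n+1}/\epsilon_n$ is exactly $1/2$ at every step, while the raw error ratio $|e_{n+1}|/|e_n|$ wanders above and below $1$:

```python
import math

def f(x):
    return x * x - 2

a0, b0 = 1.0, 2.0
root = math.sqrt(2)

a, b = a0, b0
for n in range(1, 11):            # n-th midpoint, indexed from 1
    m = (a + b) / 2
    err = abs(m - root)
    eps = (b0 - a0) * 2.0 ** -n   # the bound eps_n = (b - a) 2^{-n}
    print(f"n={n:2d}  |e_n| = {err:.6f}  eps_n = {eps:.6f}")
    if f(a) * f(m) <= 0:
        b = m
    else:
        a = m
```

At $n = 1$ this prints the "happy coincidence" from the answer: $|e_1| \approx 0.086$ against the bound $\epsilon_1 = 0.5$.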

Ian
  • But "transient" means exactly that it will phase out, and in this case it seems like it will never phase out. There will be these periodic bumps in error indefinitely. No? – Fractal20 Sep 28 '17 at 17:11
  • Ah, lim sup! I think that would do it for me. But I've never seen the definition of convergence written in terms of lim sup. Is it often defined this way? Could you point to a textbook that does? – Fractal20 Sep 28 '17 at 17:14
  • @Fractal20 $\limsup$ is just saying that it can't bump up occasionally, it has to eventually lie below $\mu+\epsilon$ even if it doesn't actually converge. I think you may actually be right that for bisection it does not do this. In fact I think it may be misleading to talk about $\frac{|e_{n+1}|}{|e_n|}$ and you may instead need to look at $\frac{\epsilon_{n+1}}{\epsilon_n}$ where $\epsilon_n$ is some well-behaved bound on $|e_n|$, in this case $\epsilon_n=(b-a) 2^{-n}$. – Ian Sep 28 '17 at 17:15
  • @Fractal20 The thing is, as a practical matter, as I said at the end of my answer, monotonic convergence, or even asymptotically monotonic convergence, tends to be hopeless. We tend to have fluke iterations where things are better than they "should" be, no matter what exactly we are doing. (There are some exceptions; for example, Newton's method for $\sqrt{2}$ converges asymptotically monotonically, in fact monotonically if you skip the first iterate; see the sketch after these comments.) – Ian Sep 28 '17 at 17:17
  • But I believe that in many cases it can be proven, e.g. Newton's method in a sufficiently small neighborhood of the root. Or is that not true? Why would a formal limit definition proliferate if no actual examples adhere to it? – Fractal20 Sep 28 '17 at 17:19
  • @Fractal20 It's actually quite unusual; the monotonicity in Newton's method for $\sqrt{2}$ follows from some monotonicity+convexity behavior that generically you don't have. I think you will see similar behavior to bisection in Newton's method if you look at a problem with $f'(x^*) \neq 0, f''(x^*) = 0$. By the way, cf. https://en.wikipedia.org/wiki/Rate_of_convergence#Extended_definition – Ian Sep 28 '17 at 17:20
  • Okay, I am content with your definition. If we use the upper bound on the error at each step then I think linear convergence nicely follows for bisection. But my question now would be, what method actually follows the naive/simple definition of convergence I gave in my question initially? – Fractal20 Sep 28 '17 at 17:45
  • @Fractal20 I can't think of anything that will always do that. Certain nice problems with nice methods will do that, but I can't think of a method that will always do that even when it is convergent. – Ian Sep 28 '17 at 17:47
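
Following up on the Newton's-method remark in the comments, here is a minimal sketch (my own, under the assumption of starting below the root at $x_0 = 1$): the first step overshoots $\sqrt{2}$, and from $x_1$ on the iterates decrease monotonically while the error shrinks quadratically.

```python
import math

# Newton iteration for f(x) = x^2 - 2: x_{n+1} = x_n - f(x_n)/f'(x_n),
# which simplifies to x_{n+1} = (x_n + 2/x_n) / 2.
x = 1.0  # deliberately start below the root
for n in range(7):
    print(f"n={n}  x_n = {x:.12f}  |e_n| = {abs(x - math.sqrt(2)):.3e}")
    x = (x + 2 / x) / 2
```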