4

What is the probability that $THTH$ occurs before $HTHH$ in an infinite sequence of coin flips?

The expected number of flips until you first see $THTH$ is $6$, while the expected number until you first see $HTHH$ is $10$. Intuitively, I would guess that the probability that $THTH$ occurs before $HTHH$ is $3/4$. Is there a formal argument to compute this probability?

Probabilistic model: We denote by $(X_{n})_{n \geq 1}$ the sequence of coin flips, taking values in $\{H,T\}^{\mathbb{N}}$. For a word $w$ we let $t_{w}$ be the first time that $w$ occurs in the sequence; in particular, $t_{THTH}$ is the first time that $THTH$ occurs. We have $$ \mathbb{E}[t_{H}] = \frac{1}{2} \mathbb{E}[t_{H} | X_{1} = H] + \frac{1}{2} \mathbb{E}[t_{H} | X_{1} = T] = \frac{1}{2} + \frac{1}{2} (\mathbb{E}[t_{H}] + 1 ), $$ which gives $\mathbb{E}[t_{H}] = 2$, and similarly

$$ \mathbb{E}[t_{TH}] = \frac{1}{2} \mathbb{E}[t_{TH} | X_{1} = H] + \frac{1}{2} \mathbb{E}[t_{TH} | X_{1} = T] $$
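
As a quick numerical sanity check on these quantities, here is a minimal Monte Carlo sketch in Python (the helper names are illustrative only); it estimates the two expected waiting times and the probability that $THTH$ appears first, which can be compared with the values discussed in the comments and answers below.

```python
import random

def waiting_times(pattern_a="THTH", pattern_b="HTHH"):
    """Flip a fair coin until both length-4 patterns have appeared;
    return the (1-based) flip number at which each first completes."""
    window = ""
    t_a = t_b = None
    n = 0
    while t_a is None or t_b is None:
        n += 1
        window = (window + random.choice("HT"))[-4:]  # keep only the last 4 flips
        if t_a is None and window == pattern_a:
            t_a = n
        if t_b is None and window == pattern_b:
            t_b = n
    return t_a, t_b

def simulate(n_trials=100_000):
    sum_a = sum_b = a_first = 0
    for _ in range(n_trials):
        t_a, t_b = waiting_times()
        sum_a += t_a
        sum_b += t_b
        a_first += t_a < t_b
    print("estimated E[time to THTH]     :", sum_a / n_trials)
    print("estimated E[time to HTHH]     :", sum_b / n_trials)
    print("estimated P(THTH before HTHH) :", a_first / n_trials)

simulate()
```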

  • The expected time to get $THTH$ is 20, not 6. – Michael Dec 09 '19 at 11:13
  • Thank you, I will check again. How could you compute $20$ so quickly? Do you have an efficient way to do this? I did the following: – user893458 Dec 09 '19 at 11:30
  • We have $E[H] = \frac{1}{2} + \frac{1}{2} ( 1 + E[H] )$, hence $E[H] = 2$. Then $E[TH] = \frac{1}{2} (1 + E[H] )+ \frac{1}{2} ( 1 + E[TH] )$, so $E[TH] = 4$. It follows that $E[HTH] = \frac{1}{2} (1 + E[TH] )+ \frac{1}{2} ( 1 + E[HTH] )$, i.e. $E[HTH] = 6$. Finally, $E[THTH] = \frac{1}{2} (1 + E[HTH] )+ \frac{1}{2} ( 1 + E[THTH] )$ and we conclude $E[THTH] = 8$. – user893458 Dec 09 '19 at 11:30
  • That reasoning is not correct. It looks like you want to define a DTMC with states $\{0, T, TH, THT, THTH\}\leftrightarrow \{0, 1, 2, 3, 4\}$; defining $X_i$ as the random time to get to state $4$, given we start in state $i$, gives $E[X_4]=0$ and $$E[X_i] = 1 + \sum_{j=0}^4 P_{ij}E[X_j] \quad \forall i \in \{0, 1, 2, 3\}$$ – Michael Dec 09 '19 at 11:40
  • To be honest, I was following the approach of https://math.stackexchange.com/questions/521130/expected-value-of-flips-until-ht-consecutively/838575#838575. – user893458 Dec 09 '19 at 11:40
  • At the link you give, I have added a comment on the incorrect reasoning given there, which (accidentally) produces the correct answer for the special case of only 2 flips. While there is indeed a simple formula for the expected time to see any particular combination, and for the question you ask, the more basic (longer) calculation uses a DTMC, where you want to compute the expected remaining time given your current state. – Michael Dec 09 '19 at 11:57
  • Many thanks for pointing out that error in the linked answer. I would like to rigorously model a Markov chain and strictly prove these results. See my updated question. Can you show me how to do this? – user893458 Dec 09 '19 at 12:10
  • Why so much focus on expectations in question and comments? You want to find a probability, right? – drhab Dec 09 '19 at 12:12
  • Yes, true. But now, as I know that my initial thoughts were wrong, I would like to correct them and understand how it really works. – user893458 Dec 09 '19 at 12:13
  • @drhab : The more basic question is to compute the expected time to THTH, and the asker has misconceptions about that (an incorrect value of 6 is claimed). It may be better to focus on the simpler question the asker needs to understand than the more complicated one the asker is asking. Nevertheless I agree that for the second question, we want probabilities rather than expectations. – Michael Dec 09 '19 at 12:13
  • @MJD Thank you for the link, especially because it seems to confirm my answer. – drhab Dec 09 '19 at 12:17
  • @user893458 : FYI: Here are general notes on expected travel times in Markov chains. http://ee.usc.edu/stochastic-nets/docs/markov-chains-travel-times.pdf – Michael Dec 09 '19 at 12:25
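
Following the DTMC approach described in the comments above, here is a short sketch (Python with NumPy; the state labels are an assumption matching that description) that computes the expected number of flips to reach $THTH$ by solving $E[X_i] = 1 + \sum_j P_{ij} E[X_j]$ over the transient states.

```python
import numpy as np

# Transient states for the pattern THTH, labelled by the useful progress so far:
# 0: "" (no useful suffix), 1: "T", 2: "TH", 3: "THT".  "THTH" itself is absorbing.
# P[i, j] = probability of moving from transient state i to transient state j in
# one fair flip; the missing mass from state 3 goes to the absorbing state.
P = np.array([
    [0.5, 0.5, 0.0, 0.0],   # from "":    H -> "",  T -> "T"
    [0.0, 0.5, 0.5, 0.0],   # from "T":   T -> "T", H -> "TH"
    [0.5, 0.0, 0.0, 0.5],   # from "TH":  H -> "",  T -> "THT"
    [0.0, 0.5, 0.0, 0.0],   # from "THT": T -> "T", H -> "THTH" (absorbed)
])

# The expected absorption times satisfy E = 1 + P E, i.e. (I - P) E = 1.
E = np.linalg.solve(np.eye(4) - P, np.ones(4))
print(E[0])  # expected number of flips from a fresh start; prints 20.0
```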

2 Answers

2

Define $p,p_{T},p_{H},p_{TH},p_{HT},p_{THT},p_{HTH}\in\left[0,1\right]$, where $p$ denotes the probability that $THTH$ occurs before $HTHH$ and e.g. $p_{TH}$ denotes the probability that $THTH$ occurs before $HTHH$ under the extra condition that we start with $TH$.

Then we have the following equalities:

  • $2p=p_{T}+p_{H}$
  • $2p_{T}=p_{T}+p_{TH}$
  • $2p_{H}=p_{HT}+p_{H}$
  • $2p_{TH}=p_{THT}+p_{H}$
  • $2p_{HT}=p_{HTH}+p_{T}$
  • $2p_{THT}=1+p_{T}$
  • $2p_{HTH}=p_{THT}$

(I avoided fractions, but things might become clearer if you divide both sides by $2$.)

I find the solutions:

  • $p_{THT}=\frac67$
  • $p_{HTH}=\frac37$
  • $p_{TH}=p_T=\frac57$
  • $p_{HT}=p_H=\frac47$

and finally: $$p=\frac{9}{14}$$

Check me on mistakes, though.
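
One mechanical way to do that check is to feed the seven equalities to a solver; a sketch using SymPy (just a verification, not part of the argument):

```python
from sympy import symbols, Eq, solve

p, pT, pH, pTH, pHT, pTHT, pHTH = symbols('p p_T p_H p_TH p_HT p_THT p_HTH')

# The seven equalities listed above, verbatim.
equations = [
    Eq(2*p,    pT + pH),
    Eq(2*pT,   pT + pTH),
    Eq(2*pH,   pHT + pH),
    Eq(2*pTH,  pTHT + pH),
    Eq(2*pHT,  pHTH + pT),
    Eq(2*pTHT, 1 + pT),
    Eq(2*pHTH, pTHT),
]

solution = solve(equations, [p, pT, pH, pTH, pHT, pTHT, pHTH])
print(solution)  # exact rationals; in particular solution[p] is 9/14
```

This reproduces the values listed above, including $p=\frac{9}{14}$.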

drhab
  • 151,093
1

We have states $\{H,T,HT,TH,THT,HTH,THTH,HTHH\}$, which are the nonempty prefixes of your two strings. We can write down a Markov transition matrix for these states:

$$A = \frac{1}{2} \begin{pmatrix} 1 & 0 & 1 & 0 & 0 & 0 & 0 & 0\\ 0 & 1 & 0 & 1 & 0 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 & 0 & 1 & 0 & 0\\ 1 & 0 & 0 & 0 & 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 0 & 1 & 0 & 0 & 1\\ 0 & 0 & 0 & 0 & 0 & 0 & 2 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 2 \end{pmatrix}$$

This is an absorbing Markov chain already in canonical form $\begin{pmatrix}Q&R\\0&I\end{pmatrix}$, thus we can find its fundamental matrix $N$ with

$$N = (I - Q)^{-1}$$ $$N = \frac{1}{7} \begin{pmatrix} 24 & 20 & 12 & 10 & 8 & 6\\ 16 & 32 & 8 & 16 & 10 & 4\\ 10 & 20 & 12 & 10 & 8 & 6\\ 16 & 18 & 8 & 16 & 10 & 4\\ 8 & 16 & 4 & 8 & 12 & 2\\ 4 & 8 & 2 & 4 & 6 & 8 \end{pmatrix}.$$

Finally we can get the probabilities of our absorbing states given an initial probability vector $\vec p$ over the non-absorbing states via ${\vec p}^{\,T}NR$. Assuming we threw one coin already, our initial vector is $[1/2, 1/2, 0, 0, 0, 0]^T$, and thus our final answer is that we end up with $THTH$ $9/14$ of the time and $HTHH$ $5/14$ of the time.
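
For completeness, the same computation as a short numerical sketch (Python with NumPy), using the state order $H, T, HT, TH, THT, HTH, THTH, HTHH$ from above:

```python
import numpy as np

# Full transition matrix A, state order: H, T, HT, TH, THT, HTH, THTH, HTHH.
A = 0.5 * np.array([
    [1, 0, 1, 0, 0, 0, 0, 0],
    [0, 1, 0, 1, 0, 0, 0, 0],
    [0, 1, 0, 0, 0, 1, 0, 0],
    [1, 0, 0, 0, 1, 0, 0, 0],
    [0, 1, 0, 0, 0, 0, 1, 0],
    [0, 0, 0, 0, 1, 0, 0, 1],
    [0, 0, 0, 0, 0, 0, 2, 0],
    [0, 0, 0, 0, 0, 0, 0, 2],
])

Q = A[:6, :6]                     # transient -> transient block
R = A[:6, 6:]                     # transient -> absorbing block
N = np.linalg.inv(np.eye(6) - Q)  # fundamental matrix

p0 = np.array([0.5, 0.5, 0, 0, 0, 0])  # one coin already thrown: H or T with prob. 1/2
print(p0 @ N @ R)                      # -> [0.6428... 0.3571...] = [9/14, 5/14]
```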

orlp
  • 10,508