If there is a constant probability $p$ of winning a round of a game, with each round independent of every other, what is the probability $P$ of losing at least $n$ consecutive rounds out of a total of $k$ rounds?
In the simple case where $k=n$, $P=(1-p)^n$. However, I have not been able to work out a formula for any case where $k>n$ (with $k,n\in\mathbb{N}$).
I've tried several approaches, including:
- As the events are independent, a streak is equally likely to start at any point. There are $k-n+1$ positions where a streak of length $n$ could start, and the probability of a streak starting at any given position is $(1-p)^n$, so $P=(k-n+1)(1-p)^n$. However, this overcounts: overlapping streaks are not disjoint events, so a streak of length $n+1$ is counted as two streaks of length $n$, and when $p$ is low and $k-n$ is large the result can even exceed $1$ (see the brute-force check after this list).
- A binomial distribution gives the probability of exactly $x$ losses out of $k$ rounds. For each $x\geq n$, one could multiply that probability by the probability that at least $n$ of those losses are consecutive, and sum over all such $x$. However, I could not work out how to count, using permutations and combinations, the arrangements of $x$ losses among $k$ rounds that contain a run of $n$ or more consecutive losses.
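For small $k$ the true probability can be computed by exhaustive enumeration, which makes the overcount in the first approach visible. A minimal sketch in Python (all names are mine):

```python
from itertools import product

def exact_prob_brute_force(p, k, n):
    """Exact P(a run of >= n consecutive losses occurs in k rounds),
    by enumerating all 2^k win/loss sequences (feasible only for small k)."""
    q = 1 - p                                  # probability of losing one round
    total = 0.0
    for outcome in product((0, 1), repeat=k):  # 1 = loss, 0 = win
        run = 0
        for x in outcome:
            run = run + 1 if x else 0
            if run >= n:                       # sequence contains the streak
                losses = sum(outcome)
                total += q**losses * p**(k - losses)
                break
    return total

p, k, n = 0.5, 10, 3
print(exact_prob_brute_force(p, k, n))  # 0.5078125
print((k - n + 1) * (1 - p)**n)         # 1.0 -- the naive formula overshoots
```

For $p=0.5$, $k=10$, $n=3$ the naive formula gives $1$, while the true probability is $0.5078125$.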
Can anyone help me solve this deceptively simple problem? I am trying to perform risk analysis, where $n$ is the number of successive investments I can afford to lose out of a total of $k$. Currently I am estimating this probability using a primitive computer simulation, but it would be much better if I could calculate (or at least approximate) the true probability in a reasonable amount of time.
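For reference, a minimal sketch of this kind of Monte Carlo estimate (illustrative names, not the actual simulation):

```python
import random

def estimate_prob(p, k, n, trials=100_000):
    """Monte Carlo estimate of P(a run of >= n consecutive losses
    occurs in k rounds), each round an independent win with prob p."""
    hits = 0
    for _ in range(trials):
        run = 0
        for _ in range(k):
            if random.random() < p:   # win: the streak resets
                run = 0
            else:                     # loss: the streak grows
                run += 1
                if run >= n:
                    hits += 1
                    break
    return hits / trials

print(estimate_prob(0.5, 10, 3))  # ~0.508 (exact value is 0.5078125)
```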
Update: this answer gives a computationally viable, Markov-chain-based approach to an almost identical problem; I've tested it and it generalises wonderfully.
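For concreteness, a minimal sketch of that Markov-chain idea as I understand it (my own names; the state is the length of the current loss streak, and reaching a streak of $n$ is absorbing):

```python
def prob_streak(p, k, n):
    """Exact P(a run of >= n consecutive losses occurs in k rounds),
    via a Markov chain whose state is the current loss-streak length
    (0 .. n-1), with 'streak of n reached' as an absorbing state."""
    q = 1 - p
    state = [0.0] * n   # state[i] = P(current streak is i, no run of n yet)
    state[0] = 1.0
    absorbed = 0.0      # probability mass that has already hit a run of n
    for _ in range(k):
        new = [0.0] * n
        new[0] = sum(state) * p          # a win resets any streak to 0
        for i in range(n - 1):
            new[i + 1] = state[i] * q    # a loss extends streak i -> i + 1
        absorbed += state[n - 1] * q     # a loss at streak n-1 completes the run
        state = new
    return absorbed

print(prob_streak(0.5, 10, 3))  # 0.5078125, matching the brute force above
```

This runs in $O(kn)$ time, which is fast enough for any realistic $k$ and $n$.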