13

I was told this problem a while ago, and recently someone explained the answer to me, but I didn't understand the explanation; could someone please explain it in (more or less) layman's terms?

You have a die with $n$ sides. Each side is numbered - uniquely - from $1$ to $n$, and has an equal probability of landing on top as the other sides (i.e. a fair die). For large $n$ (I was given it with $n = 1,000,000$), on average how many rolls does it take to achieve a cumulative score of $n$ (or greater)? That is, when you roll it, you add the result to your total score, then keep rolling and adding, and you stop when your score exceeds or is equal to $n$.

The cool thing about this problem: apparently, the answer is $e$. I would like to know exactly how this is derived.

Marcus M
  • 11,229
Bluefire
  • 1,668

4 Answers

10

If we divide every roll by $n$, then for large $n$ each rescaled roll approximates a sample from the uniform distribution on $[0,1]$. We are then looking for the expected number of samples from a uniform distribution required for their sum to exceed $1$.

For $k \in \mathbb{N}$, let $X_1, \ldots , X_k, \ldots$ be the random variables in question. Then $$ \mathbb{P}[X_1 + \ldots + X_k \leq 1] = \frac{1}{k!} $$ as $\frac{1}{k!}$ is the volume of the $k$-dimensional simplex defined by $X_1 + \ldots + X_k \leq 1$. Now, let $p_k$ be the probability that it takes exactly $k$ rolls to get above $1$; this is equal to $$ p_k = \mathbb{P}[X_1 + \ldots + X_{k-1} \leq 1] - \mathbb{P}[X_1 + \ldots + X_{k} \leq 1] = \frac{1}{(k-1)!} - \frac{1}{k!},$$ as we need the first $k-1$ samples to sum to at most $1$, and the first $k$ samples to sum above $1$. Thus, the expected number of rolls required is \begin{align} \mathbb{E}[\text{Number of Rolls Required}] &= \sum\limits_{k = 1}^\infty k\cdot p_k \\ &= \sum\limits_{k = 1}^\infty k \left( \frac{1}{(k-1)!} - \frac{1}{k!}\right) \\ &= \sum\limits_{k = 2}^\infty k \left( \frac{1}{(k-1)!} - \frac{1}{k!}\right) &\text{ as the }k=1\text{ case contributes }0 \\ &= \sum\limits_{k = 2}^\infty \frac{k-1}{(k-1)!} \\ &= \sum\limits_{k = 2}^\infty \frac{1}{(k - 2)!}\\ &= e. \end{align}
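For anyone who wants to see the limit numerically, here is a quick Monte Carlo sketch (the function name and trial count are my own choices, not part of the derivation):

```python
import random

def rolls_to_exceed_one(rng=random.random):
    """Count uniform [0, 1) samples needed for the running sum to exceed 1."""
    total, count = 0.0, 0
    while total <= 1.0:
        total += rng()
        count += 1
    return count

random.seed(0)
trials = 200_000
avg = sum(rolls_to_exceed_one() for _ in range(trials)) / trials
print(avg)  # close to e = 2.71828...
```

Note that at least two samples are always needed, since each sample is strictly less than $1$; the average over many trials settles near $e$.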

Marcus M
  • 11,229
3

No real answer, but a way to approach the problem and also to find solutions for small $n$.

Let $\mu_{k}$ denote the expected number of rolls needed to reach a score of at least $k$, for $k\in\mathbb{Z}$.

Then:

  • $\mu_{k}=0$ if $k\leq0$

  • $\mu_{k}=\frac{1}{n}\left(1+\mu_{k-1}\right)+\frac{1}{n}\left(1+\mu_{k-2}\right)+\cdots+\frac{1}{n}\left(1+\mu_{k-n}\right)=1+\frac{1}{n}\sum_{i=1}^{n}\mu_{k-i}$ if $k>0$.

We want $\mu_n$, and we have a recurrence relation to compute it.

Edit:

Based on this it can be shown that $\mu_k=(1+\frac1{n})^{k-1}$ for $0<k\leq n$, hence $\mu_n=(1+\frac1{n})^{n-1}\rightarrow e$. See the comments on this answer for that.

Credit for that goes to Byron, and Marcus spared me some work too.
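The recurrence is easy to check numerically. A short Python sketch (the function name is my own) comparing the recurrence against the closed form $(1+\frac1n)^{n-1}$:

```python
def expected_rolls(n):
    """mu[k] = expected rolls to reach a score of at least k, computed from
    mu_k = 1 + (1/n) * sum_{i=1}^{n} mu_{k-i}, with mu_k = 0 for k <= 0."""
    mu = [0.0] * (n + 1)  # mu[0] = 0 stands in for every k <= 0 term
    for k in range(1, n + 1):
        mu[k] = 1 + sum(mu[max(k - i, 0)] for i in range(1, n + 1)) / n
    return mu[n]

for n in (6, 100, 1000):
    # the two columns agree, and both approach e as n grows
    print(n, expected_rolls(n), (1 + 1 / n) ** (n - 1))
```

For $n = 6$ both expressions give $(7/6)^5 \approx 2.16$, and by $n = 1000$ they are within a fraction of a percent of $e$.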

drhab
  • 151,093
  • 4
    If you push this through, you will find that $\mu_k=(1+1/n)^{(k-1)}$ for $0<k\leq n$. –  Aug 16 '15 at 20:21
  • 1
    @ByronSchmuland Apparently it has been pushed through already. By you, I mean :). I was too lazy to try. I suspect that induction can now be applied to prove this? – drhab Aug 16 '15 at 20:28
  • 3
    @drhab, There's actually no induction necessary; since $(1 + 1/n)^{(k-1)}$ satisfies the same starting condition and recurrence relation as your $\mu_k$ sequence, the two must be equal. This means that $\mu_n = (1 + 1/n)^{n-1} \to e$ as $n \to \infty$. – Marcus M Aug 16 '15 at 20:35
  • 2
    @MarcusM Well, my approach was a fruitful one then. The credit goes to Byron. – drhab Aug 16 '15 at 20:40
1

I hope that this is valid (my memory of math is faded from 20 years ago):

Let $f(x)$ be the average number of uniformly distributed reals between $0$ and $1$ needed to add up to at least $x$, where $0 \leq x \leq 1$. Then the problem reduces to finding $f(1)$.

Let's try to determine $f(x)$. Since the distribution is uniform, there is a $1-x$ chance of hitting the target on the first try. To calculate the average for the case where we don't hit it on the first try, we must integrate $\int_{0}^{x}(1+f(x-y))dy$, where $y$ is the first value drawn. In the integrand, $1$ is the one try we just consumed and $f(x-y)$ is the average number of tries to cover the remaining $x-y$.

So, taken together, we can determine that

$$ f(x) = (1-x) + \int_{0}^{x}(1+f(x-y))dy $$

By substituting $z = x-y$, we can simplify the right-hand side:

$$f(x) = 1 + \int_{0}^{x}f(z)dz$$

Differentiating both sides gives us $f'(x) = f(x)$ and with a boundary condition of $f(0) = 1$, we find that $$f(x) = e^x$$

So it takes an average of $e^x$ tries to add up to any $x$ between $0$ and $1$ inclusive. Since $e^1 = e$, it takes an average of $e$ tries to add up to $1$.
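A quick simulation (my own sketch, not part of the derivation above) supports $f(x) = e^x$ across several values of $x$:

```python
import math
import random

def tries_to_reach(x, rng=random.random):
    """Count uniform [0, 1) samples until the running sum is at least x."""
    total, count = 0.0, 0
    while total < x:
        total += rng()
        count += 1
    return count

random.seed(1)
trials = 100_000
for x in (0.25, 0.5, 1.0):
    avg = sum(tries_to_reach(x) for _ in range(trials)) / trials
    print(x, avg, math.exp(x))  # the simulated average tracks e^x
```

The last row, $x = 1$, reproduces the $e \approx 2.718$ answer to the original question.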

Misha
  • 1,092
0

I'll be heuristic here.

The average roll of an $n$-sided die is equal to,

$$\mu={{\sum_{k=1}^n k} \over n}={{n+1} \over 2}$$

How many times do you have to roll the die? Each roll is independent, so the average score after $t$ rolls is,

$$S=\mu \cdot t$$

So the average number of rolls $t$ needed to have $S \ge n$ is given by,

$$t \ge {n \over {\mu}}={{2 n } \over {n+1}}$$

This tends to $2$ in the limit rather than $e$, so I'm guessing there is an assumption here that differs from what the problem requires.

Zach466920
  • 8,341
  • 1
    Your intuition is correct that it will take approximately $2x/(n+1)$ rolls to reach level $x$, for large $x$. In this problem, though, level $n$ is not very large and the asymptotics don't kick in yet. –  Aug 16 '15 at 21:05