
In John Derbyshire's book *Prime Obsession*, I read that if we take a sequence of i.i.d. $X_i \sim \mathbb{U}[0,1]$ and sum them, then on average we need $e$ terms to reach $\sum X_i \geq 1$.

I thought about this and tried to prove it probabilistically with a general approach, but failed. I would really appreciate some ideas on how to handle it.

bof

1 Answer


Given the $X_i$, define a random variable $N$ as the least $n$ such that $X_1+\cdots+X_n\geq 1$. Then what you want to find is $E(N)$, the expected value of $N$.

Now, for any $n$, define $p_n=P\left(\sum_{i=1}^{n} X_i<1\right)=\frac{1}{n!}$ (why?).
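The claim $p_n=\frac{1}{n!}$ is easy to sanity-check numerically. Here is a short Monte Carlo sketch in Python (not part of the original answer; the helper name `p_n_estimate` is mine):

```python
import math
import random

def p_n_estimate(n, trials=200_000):
    """Monte Carlo estimate of P(X_1 + ... + X_n < 1) for i.i.d. Uniform[0,1]."""
    hits = sum(1 for _ in range(trials)
               if sum(random.random() for _ in range(n)) < 1)
    return hits / trials

# Compare the estimates with 1/n! for small n.
for n in range(1, 6):
    print(n, p_n_estimate(n), 1 / math.factorial(n))
```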

So if, for $n\geq 0$, $Y_n$ is the random variable equal to $1$ if $\sum_{i=1}^{n} X_i<1$ and $0$ otherwise, then:

$$N=\sum_{n=0}^\infty Y_n$$

And $E[Y_n]=p_n$.

So: $$E[N]=\sum_{n=0}^{\infty} E[Y_n]=\sum_{n=0}^{\infty}\frac{1}{n!}=e$$
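As a sanity check (my own sketch, not part of the original answer), simulating $N$ directly gives an average close to $e$:

```python
import math
import random

def sample_N():
    """Count how many Uniform[0,1] draws are needed for the running sum to reach 1."""
    total, n = 0.0, 0
    while total < 1:
        total += random.random()
        n += 1
    return n

trials = 200_000
estimate = sum(sample_N() for _ in range(trials)) / trials
print(estimate, math.e)  # the estimate should be close to e
```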


Discrete version, and $\left(1+\frac 1m\right)^m\to e$

Fix integer $m>1$. Let $X_1,X_2,\dots,X_m,X_{m+1}$ be independent uniform selections of values from $\{1,2,\dots,m\}$. Let $N$ be the smallest number $n$ such that $X_1+X_2+\cdots+X_n>m.$

For $0\leq j\leq m$, let $Y_j$ be $1$ if $X_1+\cdots+X_j<m+1$ and $0$ otherwise. Then:

$$N=\sum_{j=0}^{m} Y_j$$

So:

$$E(N)=\sum_{j=0}^{m} E(Y_j)$$

But $E(Y_j)=P(X_1+\cdots+ X_j<m+1)$ which, by the usual stars and bars argument, is: $$E(Y_j)=\frac{\binom{m}{j}}{m^j}$$
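The stars-and-bars count can be verified by brute force for small $m$ (a Python sketch of mine, not part of the original answer; `count_small_sums` is a hypothetical helper name):

```python
from itertools import product
from math import comb

def count_small_sums(m, j):
    """Count length-j sequences over {1, ..., m} whose sum is at most m."""
    return sum(1 for seq in product(range(1, m + 1), repeat=j)
               if sum(seq) <= m)

# The count equals C(m, j), so P(X_1 + ... + X_j < m+1) = C(m, j) / m^j.
m = 6
for j in range(m + 1):
    assert count_small_sums(m, j) == comb(m, j)
```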

So:

$$E(N)=\sum_{j=0}^{m} \binom{m}{j}\frac{1}{m^j}=\left(1+\frac1m\right)^m$$

This shows a probabilistic reason that $\left(1+\frac 1m\right)^m\to e$.
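The discrete version can also be checked by simulation (again my own sketch, not from the answer), e.g. for $m=10$:

```python
import random

def sample_N_discrete(m):
    """Count draws from {1, ..., m} needed for the running sum to exceed m."""
    total, n = 0, 0
    while total <= m:
        total += random.randint(1, m)
        n += 1
    return n

m, trials = 10, 200_000
estimate = sum(sample_N_discrete(m) for _ in range(trials)) / trials
print(estimate, (1 + 1 / m) ** m)  # the two numbers should be close
```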


A simpler case

A slightly easier but related case is to look at $X_1,X_2,\dots$ uniform in $[0,1]$ and define $N$ to be the first $n$ such that $X_n\leq X_{n-1}$.

Define $Y_i$, for $i\geq 0$, to be the event $X_1<X_2<\cdots<X_i$. Then $$E(N)=\sum_{i=0}^{\infty} P(Y_i)$$

It is more obvious here that $P(Y_i)=\frac{1}{i!}$: there is zero probability that any two of the $X_i$ are equal, and for any set $\{x_1,\dots,x_i\}$ of distinct numbers, exactly one of its $i!$ orderings is sorted.
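This version is also easy to simulate (a Python sketch of mine; the function name `first_descent` is an assumption, not from the answer):

```python
import math
import random

def first_descent():
    """Return the first index n (1-based) with X_n <= X_{n-1} for Uniform[0,1] draws."""
    prev = random.random()
    n = 1
    while True:
        n += 1
        cur = random.random()
        if cur <= prev:
            return n
        prev = cur

trials = 200_000
estimate = sum(first_descent() for _ in range(trials)) / trials
print(estimate, math.e)  # the estimate should again be close to e
```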

The discrete version is also easier. There, it is obvious that $P(Y_i)=\binom{m}{i}\frac{1}{m^i}$, because we can take any $i$-subset of $\{1,\dots,m\}$ and get exactly one sequence $x_1<x_2<\cdots<x_i$.

Thomas Andrews
  • Concerning the step: $p_n=P(\sum_{i=1}^n X_i<1)=\frac{1}{n!}$ I found one explanation here: https://math.stackexchange.com/questions/1683558/probability-that-sum-of-independent-uniform-variables-is-less-than-1 – Koval Boris Nov 09 '17 at 20:47
  • 1
    Yeah, the "why" was really meant as a question for the OP, but a link is helpful. Basically, $p_n$ is the volume of a certain simplex. – Thomas Andrews Nov 09 '17 at 20:50