
I understand that the probability that the sum of n uniform random variables is less than 1 is 1/n!, but why is the expected sum of n uniform random variables the summation of 1/n!? How can this be derived intuitively?

Thanks, L

1 Answer


Trying some extended mind-reading skills. I hope I understood the question correctly.

Given the sum of n independent and identically distributed U(0, 1) random variables: $X=\sum_{k=1}^{n}U_{k}$.

Now $P(X<1)=\frac{1}{n!}$.
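As a quick sanity check, the identity $P(X<1)=\frac{1}{n!}$ can be verified by Monte Carlo (a sketch; the function name and trial count are my own choices):

```python
import math
import random

def prob_sum_below_one(n, trials=200_000):
    """Monte Carlo estimate of P(U_1 + ... + U_n < 1) for i.i.d. U(0,1) variates."""
    hits = sum(1 for _ in range(trials)
               if sum(random.random() for _ in range(n)) < 1)
    return hits / trials

# Compare the empirical estimate against 1/n! for small n
for n in range(1, 5):
    print(n, prob_sum_below_one(n), 1 / math.factorial(n))
```

The estimates should track 1/2, 1/6, 1/24, ... closely for the larger trial counts.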

To show: the expected number of variates $N$ needed before the sum first exceeds 1 is $E[N]=\sum_{n=0}^{\infty}\frac{1}{n!}=e$.

How to show this:

The probability that the sum of $n$ variates exceeds 1 while the sum of $n-1$ variates is still below 1 (i.e., that exactly $n$ variates are needed) equals the probability that the sum of $n$ variates exceeds 1 minus the probability that the sum of $n-1$ variates exceeds 1:

$\left(1-\frac{1}{n!}\right)-\left(1-\frac{1}{(n-1)!}\right)=\frac{1}{(n-1)!}-\frac{1}{n!}=\frac{n-1}{n!}=\frac{1}{n(n-2)!}\qquad(n\ge 2)$
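This distribution for the stopping count can itself be checked by simulation (a sketch; the helper name and trial count are my own choices):

```python
import math
import random
from collections import Counter

def count_until_exceeds_one():
    """Draw U(0,1) variates until their running sum exceeds 1; return how many were needed."""
    total, n = 0.0, 0
    while total <= 1.0:
        total += random.random()
        n += 1
    return n

trials = 200_000
counts = Counter(count_until_exceeds_one() for _ in range(trials))
# Compare empirical frequencies against 1/(n(n-2)!) for small n
for n in range(2, 6):
    print(n, counts[n] / trials, 1 / (n * math.factorial(n - 2)))
```

The empirical frequencies should be close to 1/2, 1/3, 1/8, 1/30 for n = 2, 3, 4, 5.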

So the desired expected value is $\sum_{n=2}^{\infty} n\cdot \frac{1}{n(n-2)!}=\sum_{n=2}^{\infty}\frac{1}{(n-2)!}=\sum_{k=0}^\infty \frac{1}{k!}=e$. (The sum starts at $n=2$, since a single $U(0,1)$ variate is always below 1.)
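The whole derivation can be checked end to end by simulation: draw variates until the running sum exceeds 1, then average the counts over many trials (a sketch; names and trial count are my own choices):

```python
import math
import random

def count_until_exceeds_one():
    """Draw U(0,1) variates until their running sum exceeds 1; return how many were needed."""
    total, n = 0.0, 0
    while total <= 1.0:
        total += random.random()
        n += 1
    return n

trials = 200_000
estimate = sum(count_until_exceeds_one() for _ in range(trials)) / trials
# The average stopping count should be close to e ≈ 2.71828
print(estimate, math.e)
```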