standard uniform distribution
The standard uniform distribution, $X \sim U(0, 1)$, has a probability density function (PDF):
$$ f(x) = \begin{cases} 1 \quad \text{if } 0 < x < 1\\ 0 \quad \text{otherwise} \end{cases} $$
$X \sim U(0, 1)$ has
- mean $E(X) = \mu_X = 0.5$;
- variance $V(X) = \frac 1 {12}$; and,
- $P(a < X < b) = \int_a^b f(x)dx$
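These facts are easy to sanity-check numerically; a minimal Monte Carlo sketch in Python (the sample size, seed, and interval endpoints are my own choices):

```python
import random

random.seed(0)
N = 200_000
xs = [random.random() for _ in range(N)]  # draws from U(0, 1)

mean = sum(xs) / N                          # should be near 0.5
var = sum((x - mean) ** 2 for x in xs) / N  # should be near 1/12 ~ 0.0833
p = sum(0.2 < x < 0.7 for x in xs) / N      # P(0.2 < X < 0.7) = 0.7 - 0.2 = 0.5
print(mean, var, p)
```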
sum of two independent random variables, $X+Y$
Given two independent random variables, $X$ and $Y$, we have
- $E(X+Y) = E(X) + E(Y)$; and,
- $V(X+Y) = V(X) + V(Y)$.
(The variance identity needs independence; the expectation identity holds regardless.)
Also, if $X$ and $Y$ have PDFs $f$ and $g$, respectively, then the PDF of $X + Y$ is the convolution of $f$ and $g$, which, assuming $f$ and $g$ are supported only on $[0, \infty)$, is
$$ (f \ast g)(x) = \int_0^x f(u)g(x-u)du $$
and
$$P(a < X+Y < b) = \int_a^b (f \ast g)(x)dx$$
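For two $U(0, 1)$ variables the convolution works out to the triangular density on $[0, 2]$ ($x$ on $[0,1]$, $2-x$ on $[1,2]$), which gives something concrete to check the integral against. A small numerical sketch; the step size `h` and the midpoint rule are my own choices:

```python
h = 0.001  # grid step for the midpoint rule (accuracy knob, my choice)

def f(x):
    """PDF of U(0, 1)."""
    return 1.0 if 0 < x < 1 else 0.0

def convolve(f, g, x):
    """(f * g)(x) = integral_0^x f(u) g(x - u) du, via the midpoint rule."""
    steps = round(x / h)
    return sum(f((k + 0.5) * h) * g(x - (k + 0.5) * h) for k in range(steps)) * h

# The exact convolution is triangular, so both of these should be near 0.5.
print(convolve(f, f, 0.5), convolve(f, f, 1.5))
```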
$n$th self-convolution, $f^n$
Let $f^n$ be the $n$th self-convolution of $f$, i.e. $f$ convolved with itself $n - 1$ times. More formally,
$$ f^n(x) = \begin{cases} \begin{align} f(x) \quad &\text{if } n = 1\\ (f \ast f^{n-1})(x) \quad &\text{if } n > 1 \end{align} \end{cases} $$
i.e.
- $f^1(x) = f(x)$
- $f^2(x) = (f \ast f)(x)$
- $f^3(x) = (f \ast f^2)(x)$
- etc.
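The recursion can be evaluated numerically straight from the definition. A sketch for the $U(0, 1)$ density; the grid step `h` and the memoization are my own additions, and the coarse grid trades accuracy for speed:

```python
h = 0.01  # grid step for the midpoint rule (my choice)

def f(x):
    """PDF of U(0, 1)."""
    return 1.0 if 0 < x < 1 else 0.0

def self_convolve(f, n):
    """Return f^n per the recursion: f^1 = f, f^n = f * f^(n-1)."""
    if n == 1:
        return f
    prev = self_convolve(f, n - 1)
    cache = {}  # memoize: the recursion re-evaluates the same points often

    def fn(x):
        if x not in cache:
            steps = round(x / h)
            # (f * f^{n-1})(x) = integral_0^x f(u) f^{n-1}(x - u) du
            cache[x] = sum(f((k + 0.5) * h) * prev(x - (k + 0.5) * h)
                           for k in range(steps)) * h
        return cache[x]

    return fn

f2 = self_convolve(f, 2)
f3 = self_convolve(f, 3)
# f^2 is the triangular density (value 1 at x = 1); f^3 peaks at 3/4 at x = 1.5.
print(f2(1.0), f3(1.5))
```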
sum of $n$ i.i.d. random variables, $Y_n = X_1 + \cdots + X_n$
Let $Y_n = \sum_{i=1}^n X_i$ for $n \geq 1$, i.e. the sum of the $n$ random variables $X_1, \ldots, X_n$.
If the $X_i$ are i.i.d. (independent and identically distributed) and each with PDF $f(x)$, we have
- $E(Y_n) = n E(X_1)$; and,
- $V(Y_n) = nV(X_1)$.
- $P(a < Y_n < b) = \int_a^b f^n(x)dx$
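A quick Monte Carlo check of the mean and variance formulas, here with $n = 4$ (sample size and seed are my own choices):

```python
import random

random.seed(1)
n, N = 4, 100_000
ys = [sum(random.random() for _ in range(n)) for _ in range(N)]  # samples of Y_4

mean = sum(ys) / N                          # E(Y_4) = 4 * 0.5 = 2
var = sum((y - mean) ** 2 for y in ys) / N  # V(Y_4) = 4 / 12 = 1/3
print(mean, var)
```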
How do I compute $P(Y_n < 0.5)$ and $P(Y_n > 1)$, where $Y_n = X_1 + \cdots + X_n$ and $X_i \sim U(0, 1)$?
We have $n$ i.i.d. random variables $X_i \sim U(0, 1)$ each with a PDF of
$$ f(x) = \begin{cases} 1 \quad \text{if } 0 < x < 1\\ 0 \quad \text{otherwise} \end{cases} $$
I need to compute
- $P(Y_n < 0.5) = \int_0^{0.5} f^n(x)dx$; and,
- $P(Y_n > 1) = \int_1^\infty f^n(x)dx$
where, again, $Y_n = X_1 + \cdots + X_n$ for $n \geq 1$.
This involves computing the $n$th self-convolution of $f$ for arbitrary $n$. Is there a known formula for the $n$th self-convolution of $f$? Can I use some trick to compute these probabilities using the fact that $E(Y_n) = nE(X)$ and $V(Y_n) = nV(X)$, or the fact that the Central Limit Theorem tells us that the distribution of $Y_n$ (suitably standardized) tends towards the normal distribution as $n$ tends to $\infty$?
Ultimately, I need to compute
$$ \sum_{i=2}^\infty \Bigg(\bigg( \prod_{j=1}^{i-1} P(Y_j < 0.5) \bigg) P(Y_i > 1) \Bigg) $$
to $d$ digits past the decimal point. Comparing consecutive terms of the sum, say the $n$th and the $(n+1)$th, we have
$$ \begin{align} n\text{th term:} &\quad P(Y_1 < 0.5) \cdot P(Y_2 < 0.5) \cdots P(Y_{n-1} < 0.5) \cdot P(Y_n > 1)\\ (n+1)\text{th term:} &\quad P(Y_1 < 0.5) \cdot P(Y_2 < 0.5) \cdots P(Y_{n-1} < 0.5) \cdot P(Y_n < 0.5) \cdot P(Y_{n+1} > 1) \end{align} $$
Factoring out the common factor $C = P(Y_1 < 0.5) \cdot P(Y_2 < 0.5) \cdots P(Y_{n-1} < 0.5)$, these become
$$ \begin{align} n\text{th term:} &\quad C \cdot P(Y_n > 1)\\ (n+1)\text{th term:} &\quad C \cdot P(Y_n < 0.5) \cdot P(Y_{n+1} > 1) \end{align} $$
How do they compare? I think $P(Y_n < 0.5) \ll P(Y_n > 1) < P(Y_{n+1} > 1)$, so that $P(Y_n > 1) > P(Y_n < 0.5) \cdot P(Y_{n+1} > 1)$; my gut feeling is that the terms decrease substantially (proof needed). Assuming the terms are decreasing, there is a term, say the $n$th, whose value is less than $10^{-d}$ and which doesn't contribute to the first $d$ digits of the sum. Once we hit that term, we can stop, because it and all subsequent terms don't contribute to the first $d$ digits of the sum$^*$.
$^*$ Actually, I think this is only the case if we also prove that the $i$th term is $\geq 10 \times$ the $(i+1)$th term for all $i \geq n$.
I computed $P(Y_n < 0.5)$ and $P(Y_n > 1)$ manually and also using Desmos for a few $n$:
| n | P(Y_n < 0.5) | P(Y_n > 1) |
|---|---|---|
| 1 | 0.5 | 0 |
| 2 | 0.25 | 0.5 |
| 3 | 0.020833... | 0.833... |
| 4 | 0.002604166... | 0.95833... |