
A random variable $N$, taking values in the natural numbers, is defined by $$\sum_{i=1}^{N} X_i \geq 1$$ and $$\sum_{i=1}^{N-1} X_i < 1.$$

The $X_i$'s are independent and identically distributed $U(0,1)$ random variables.

What is $\mathbb{E}[N]$?

I want an approach using the tools of renewal theory, stochastic processes.

My try:

Using the classical approach one can find that $\Pr \{N=n\} = \frac{n-1}{n!}$, and thus $\mathbb{E}[N] = e$.
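For completeness, the step I am using is the simplex-volume identity: writing $S_n = X_1+\cdots+X_n$, one has $\Pr\{S_n < 1\} = \frac{1}{n!}$, so that

$$\Pr\{N=n\}=\Pr\{S_{n-1}<1\}-\Pr\{S_n<1\}=\frac{1}{(n-1)!}-\frac{1}{n!}=\frac{n-1}{n!},\qquad \mathbb{E}[N]=\sum_{n\geq 2}\frac{n(n-1)}{n!}=\sum_{n\geq 2}\frac{1}{(n-2)!}=e.$$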

But how do we approach via stochastic processes?

Thanks in advance.

  • What can "approach via stochastic processes" even mean, if not what the classical approach does? – Did Jun 12 '17 at 08:21
  • @Did By "approach via stochastic processes" I mean can we use the results of martingales, or renewal theory, or even Wald's lemma for example to get the result of this problem. I know that $N$ is a stopping time if we consider $X_i$'s to be the interarrival random variables, but I am not able to formulate it properly. – Random-generator Jun 12 '17 at 08:26
  • You do not use renewal theory in what you call "the classical approach"? – Did Jun 12 '17 at 08:27
  • @Did Yeah. But can one use the results developed in renewal theory without explicitly finding $\Pr\{N=n\}$ to get to the expected value of $N$? – Random-generator Jun 12 '17 at 08:32
  • This was already explained on the site but I re-posted an answer explaining the approach you have in mind (if I understand correctly what you are saying). – Did Jun 12 '17 at 08:39
  • @Did Didn't know this was a repost. Sorry for the trouble. – Random-generator Jun 12 '17 at 08:42

1 Answer


To compute the expectation of $N=\inf\{n\mid X_1+\cdots+X_n>1\}$ without computing its distribution, the most direct approach might be to enlarge the setting: for every nonnegative $x$, consider the mean number $n(x)$ of random variables uniform on $(0,1)$ needed to bring a sum started at level $x$ above $1$.

Thus, $n(x)=E(N_x)$ where $N_x=\inf\{n\mid x+X_1+\cdots+X_n>1\}$ and one is after $n(0)$.

First, $n(x)=0$ for every $x>1$. Second, for every $x$ in $[0,1]$, conditioning on the uniform random variable $X_1$, one gets the relation $$n(x)=1+E(n(x+X_1))$$ that is, $$n(x)=1+\int_0^{1-x}n(x+u)du=1+\int_x^1n(u)du$$ Differentiating, one gets $$n'(x)=-n(x)$$ for every $x$ in $[0,1]$, with $n(1)=1$, hence $$n(x)=e^{1-x}$$ for every $x$ in $[0,1]$, in particular,

$$E(N)=n(0)=e$$
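Not part of the original answer: a minimal Monte Carlo sketch to check $n(x)=e^{1-x}$ numerically. The helper names `sample_N` and `mean_N`, and the extra `threshold` parameter, are assumptions for illustration (the `threshold` argument is only there so the same helper can be reused for the exercise below).

```python
import math
import random

def sample_N(start=0.0, threshold=1.0):
    """Number of U(0,1) draws needed for the running sum,
    started at `start`, to exceed `threshold`."""
    total = start
    count = 0
    while total <= threshold:
        total += random.random()
        count += 1
    return count

def mean_N(start=0.0, threshold=1.0, trials=100_000):
    """Monte Carlo estimate of E[N_x] for the given start level."""
    return sum(sample_N(start, threshold) for _ in range(trials)) / trials

if __name__ == "__main__":
    # Compare the simulated mean with the predicted value e^(1-x).
    for x in (0.0, 0.25, 0.5, 0.75):
        print(f"x = {x:.2f}: simulated {mean_N(x):.4f}, predicted {math.exp(1 - x):.4f}")
```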

Exercise: Using the same approach, compute $E(N^{(2)})$, where $N^{(2)}=\inf\{n\mid X_1+\cdots+X_n>2\}$. You should find $E(N^{(2)})=e^2-e$. Give a simple argument to explain why $E(N^{(2)})<2E(N)$.
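A quick numerical check of the exercise, reusing the hypothetical `mean_N` helper from the sketch above:

```python
# Both values should be close to 4.67 (= e^2 - e).
print(mean_N(threshold=2.0))
print(math.exp(2) - math.e)
```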

Did
  • For the exercise mentioned: A simple argument would be that from $0$ to $1+\epsilon$ we take $E(N)$, and then we start off at level $\epsilon$ on the second independent stretch, i.e., from $1$ to $2$. But using a similar approach, won't we end up with $n^{(2)}(x)=e^{2-x}$ with $n^{(2)}(x)=0$ for $x>2$ and $E(N^{(2)}) = n^{(2)}(0)$? – Random-generator Jun 12 '17 at 11:38
  • Yes, this is the argument showing that $E(N^{(2)})<2E(N)$. Regarding the reasoning that would lead to $n^{(2)}(x)=e^{2-x}$ (which does not hold if $x<1$), I suggest that you write it down carefully to understand at which step it goes astray. – Did Jun 12 '17 at 12:26
  • $n(x)=1+E(n(x+X_1))$ doesn't work for this case, does it? – Random-generator Jun 12 '17 at 13:58
  • Huh? Why? (And sorry but this is not what I call writing the argument down carefully.) – Did Jun 12 '17 at 15:38
  • I'm still not getting what's wrong with this: $n(x)=1+\int_0^{2-x}n(x+u)du=1+\int_x^2n(u)du$, for all $x\in [0,2]$. I'd appreciate more hints. Thanks in advance. – Random-generator Jun 12 '17 at 18:13
  • You see? The parameter $u$ refers to the possible values of $X_1$, always in $(0,1)$, but you integrate from $0$ to $2-x$, which is larger than $1$ for some values of $x$. Not true. – Did Jun 12 '17 at 18:35
  • Got it! So the differential equation turns out to be $\frac{dy(x)}{dx }= e^{1-x}-y(x),$ with $y(1)=e$. Thanks for this awesome approach where one could solve for any threshold $c\in \mathbb{R}_+$. – Random-generator Jun 13 '17 at 07:37
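For completeness, the ODE in the last comment solves by an integrating factor (a sketch, with $y=n^{(2)}$ on $[0,1]$ and boundary value $y(1)=n(0)=e$): $$\bigl(e^{x}y(x)\bigr)'=e^{x}\bigl(y'(x)+y(x)\bigr)=e \implies e^{x}y(x)=ex+e^{2}-e,$$ hence $E(N^{(2)})=y(0)=e^{2}-e$, as announced in the exercise.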