
On page 201 of https://stanford.edu/~dntse/Chapters_PDF/Fundamentals_Wireless_Communication_chapter5.pdf, the following is mentioned:

> This observation suggests that the capacity result (5.89) holds for a much broader class of fading processes. Only the convergence in (5.91) is needed. This says that the time average should converge to the same limit for almost all realizations of the fading process, a concept called ergodicity, and it holds in many models.

I just wanted to confirm whether my interpretation of this is correct: in equation (5.91), the $h[m]$ are random variables drawn from a discrete-time random process $h$. This yields another discrete-time random process $\log(1 + |h[m]|^2 \, \mathsf{SNR})$, and assuming that process is ergodic in the mean, equation (5.91) holds.
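A quick numerical sanity check of that reading (the i.i.d. $\mathcal{CN}(0,1)$ Rayleigh fading model and the SNR value are my choices for illustration, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
SNR, N = 10.0, 200_000   # linear SNR and record length (arbitrary choices)

# one realization of an i.i.d. CN(0,1) fading process h[m]
h = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# left side of (5.91): time average along this single realization
time_avg = np.mean(np.log2(1.0 + np.abs(h)**2 * SNR))

# right side: E[log2(1 + |h|^2 SNR)], estimated over independent draws
# (|h|^2 is exponentially distributed with unit mean for CN(0,1) fading)
ensemble_avg = np.mean(np.log2(1.0 + rng.exponential(1.0, N) * SNR))

print(time_avg, ensemble_avg)   # the two agree to within Monte Carlo error
```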

  • It seems to me that, *directly from* the meaning of "ergodic" (in every sense), Eq. 5.91 holds. That's the meaning of ergodicity. If it's generally ergodic, that means that every time average of *any* continuous function of $h[n]$ must be equal to the probabilistic average of the same function of $h[n]$ (i.e. the Expected Value of that same function of $h[n]$). I do not know why they say: "By the law of large numbers, ..." It's because it's ergodic. – robert bristow-johnson Jan 04 '24 at 20:35
  • You might find this answer to be useful, or, perhaps adding to your confusion! – Dilip Sarwate Jan 05 '24 at 00:09
  • @robertbristow-johnson "By the law of large numbers" is sufficient to have the convergence, which is also a consequence of the ergodicity assumption embedded in the chosen models. Just being rigorous, I think. – AlexTP Jan 05 '24 at 10:13

1 Answer


I'll change the notation a little. Let's say you have a discrete-time process, $x[n]$, random or deterministic, that exists for all discrete times $n$, and suppose it goes into some "decent" (I think continuous) function $f(x)$.

The time-average of the result of that function is:

$$ \overline{f(x)} \triangleq \lim_{N \to \infty} \frac{1}{2N+1} \sum\limits_{n=-N}^{+N} f\big( x[n] \big) $$
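For a finite record, this is just a sample mean over the available samples; a minimal NumPy sketch (the helper name is mine):

```python
import numpy as np

def time_average(x, f):
    """Sample-mean estimate of the time average of f(x[n]) over a finite record."""
    x = np.asarray(x)
    return np.mean(f(x))   # f is assumed to act elementwise (vectorized)
```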

Now, let's say that $x[n]$ is a stationary random process, so all statistics of $x[n]$ are constant with respect to (discrete) time $n$. Then the probabilistic-average of the result of that function is the expectation value:

$$ \mathbb{E}\Big\{ f\big( x[n] \big)\Big\} \triangleq \int\limits_{-\infty}^{+\infty} \mathrm{p}_x(\alpha) f\big( \alpha \big) \ \mathrm{d}\alpha $$
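Numerically, this is a one-dimensional integral of the p.d.f. against $f$; a sketch using scipy.integrate.quad (the function names are mine):

```python
import numpy as np
from scipy.integrate import quad

def probabilistic_average(pdf, f):
    """E{ f(x[n]) } = integral of pdf(alpha) * f(alpha) d(alpha)."""
    val, _ = quad(lambda a: pdf(a) * f(a), -np.inf, np.inf)
    return val

# sanity check: for x ~ N(0,1) and f(x) = x^2, the answer is 1
gauss = lambda a: np.exp(-a**2 / 2) / np.sqrt(2 * np.pi)
print(probabilistic_average(gauss, lambda a: a**2))   # ~1.0
```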

where $\mathrm{p}_x(\alpha)$ is the probability density function (p.d.f.) of the random variable $x[n]$ and is independent of $n$:

$$ \int\limits_{\alpha}^{\alpha + \Delta \alpha} \mathrm{p}_x(u) \ \mathrm{d}u = \mathbb{P}\Big\{\alpha \le x[n] < \alpha + \Delta \alpha \Big\} $$

or, in the limit of small $\Delta \alpha$,

$$ \mathrm{p}_x(\alpha) = \lim_{\Delta \alpha \to 0} \frac{1}{\Delta \alpha} \mathbb{P}\Big\{\alpha \le x[n] < \alpha + \Delta \alpha \Big\} $$

and $\mathbb{P}\big\{\cdot\big\}$ means the probability of the event defined therein.
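That bin-probability definition can be checked empirically: draw many samples of $x[n]$ (for a stationary process the p.d.f. is the same at every $n$), count the fraction landing in $[\alpha, \alpha + \Delta \alpha)$, and divide by $\Delta \alpha$. A sketch with $x[n] \sim \mathcal{N}(0,1)$, my choice of example:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)   # samples of a stationary N(0,1) process

alpha, dalpha = 0.5, 0.01
# P{ alpha <= x[n] < alpha + dalpha } / dalpha  ->  p_x(alpha)
p_hat = np.mean((alpha <= x) & (x < alpha + dalpha)) / dalpha
print(p_hat)   # close to the N(0,1) density at 0.5, about 0.352
```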

Now, my understanding of the root meaning of the term *ergodic*, as applied to a random process $x[n]$, is that every time average is the same as the corresponding probabilistic average. That is, for any function $f(\cdot)$,

$$ \overline{f(x)} = \mathbb{E}\Big\{ f\big( x[n] \big)\Big\} $$

That's what "ergodic" means. A time average and a probabilistic average are two different ways of getting at the average of something, and "ergodic" means that those two different ways get to the same average.
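Here's a toy check of that equality for a process I know to be stationary and ergodic, an i.i.d. Gaussian sequence, with an arbitrary $f$ (all of the specific choices here are mine, for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
f = lambda x: np.log2(1.0 + 10.0 * x**2)   # some "decent" function of x[n]

# rows = independent realizations of the process, columns = time n
X = rng.standard_normal((2_000, 2_000))

time_avg     = np.mean(f(X[0, :]))   # average over time, single realization
ensemble_avg = np.mean(f(X[:, 0]))   # average over realizations, fixed n

print(time_avg, ensemble_avg)        # ergodicity: the two agree
```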

robert bristow-johnson