
I have a question about the Shannon theorem. According to this theorem, the maximum capacity of a link can be calculated with the formula: $$C=B\log_2(1+S/N)$$ where $B$ is the bandwidth and $S/N$ is the signal-to-noise ratio. But I am not very sure what the "capacity" term in this equation means, since it seems it should account for the probability of bit error in some way.
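For reference, here is how I am reading the formula numerically (a minimal sketch; the bandwidth and SNR values below are made up, just to fix units):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit C = B * log2(1 + S/N), in bits per second.
    snr_db is converted to the linear ratio S/N before use."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Made-up example: a 3 kHz channel at 30 dB SNR -> roughly 29.9 kbit/s
print(shannon_capacity(3_000, 30))
```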

What I mean is that the capacity might not be defined as "number of bits transmitted per second" but as something like "number of bits transmitted and correctly received per second". But that still makes no sense, since you can transmit with low $S/N$ and a symbol length tending to $0$, getting a capacity that tends to infinity and a probability of bit error close to 0.5 (it is theoretically impossible to have a bit error rate above 50%), so the "number of bits transmitted and correctly received per second" would tend to infinity and would be higher than the Shannon limit. I was thinking that the right definition of the $C$ in the formula could be "number of bits transmitted per second $\times$ (0.5 $-$ probability of bit error)", which could make sense, but I am not very sure what this measure really means.

Dilip Sarwate
  • a better, more general relationship, when $S$ and $N$ are functions of frequency (i think we call it "power spectral density" or "PSD") is

    $$ C= \int\limits_{0}^{B} \log_2\left(1 + \frac{S(f)}{N(f)} \right) \, df$$

    – robert bristow-johnson Jan 22 '18 at 06:36
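(As an aside, a minimal numerical sketch of the integral form quoted in the comment above; the PSD shapes $S(f)$ and $N(f)$ below are made up, purely for illustration.)

```python
import numpy as np

B = 4_000.0                          # assumed channel bandwidth, Hz
f = np.linspace(0.0, B, 10_000)      # frequency grid over [0, B]
S = 1e-6 * np.exp(-f / 2_000.0)      # hypothetical signal PSD, W/Hz
N = 1e-9 * np.ones_like(f)           # hypothetical flat noise PSD, W/Hz

# C = integral over [0, B] of log2(1 + S(f)/N(f)) df, via the trapezoid rule
C = np.trapz(np.log2(1.0 + S / N), f)
print(C)                             # bits per second
```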

1 Answer


The formula you wrote corresponds to the error-free capacity. Namely, Shannon is telling us that $C$ is the maximum (theoretical) rate at which information can be transmitted reliably over that link.

Shannon managed to prove that if the information rate on the channel being used is less than $C$, then it is theoretically possible to find a coding system for the signal such that the transmission will have an arbitrarily small probability of error. Unfortunately, the theorem does not tell us how to find that code, just that it exists.

That's why there is no reference to the error rate in the formula for capacity. The formula itself is telling you the maximum information rate you can have with an arbitrarily small probability of error, and that value corresponds to $C$.
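As a rough sketch of what the theorem does (and does not) promise, one can phrase it as a yes/no check on a target rate (a minimal illustration with made-up numbers):

```python
import math

def reliable_rate_possible(rate_bps, bandwidth_hz, snr_linear):
    """Noisy-channel coding theorem as a predicate: rates strictly below
    C = B*log2(1 + S/N) are achievable with arbitrarily small (but nonzero)
    error probability by *some* code; rates above C are not."""
    capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
    return rate_bps < capacity_bps

# Hypothetical 1 MHz channel at 20 dB SNR (S/N = 100) -> C is about 6.66 Mbit/s
print(reliable_rate_possible(5e6, 1e6, 100.0))   # True: below capacity
print(reliable_rate_possible(8e6, 1e6, 100.0))   # False: above capacity
```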

Sidenote: remember that this formula is only valid for additive white Gaussian noise (AWGN).

Tendero
  • Just a nitpick: the theorem says that if the rate is less than $C$, then you can find a code such that the probability of error $\varepsilon>0$ is as small as you want -- but not zero. – MBaz Jan 21 '18 at 16:45
  • @MBaz True, I've edited the answer. – Tendero Jan 21 '18 at 16:59
  • white Gaussian noise has infinite power. i don't think you mean $N=\infty$. – robert bristow-johnson Jan 22 '18 at 06:31
  • and i am not sure your sidenote is valid anyhoo. – robert bristow-johnson Jan 22 '18 at 06:38
  • @robertbristow-johnson $N$ would be the variance of the noise (or the average power). Therefore, it is not infinite. The fact that the formula is valid for AWGN is correct. You can check that in Wikipedia. – Tendero Jan 22 '18 at 12:28
  • sorry, Ten, but your sidenote that "the formula is only valid for white Gaussian noise" is just not true. white noise has infinite bandwidth and when you integrate that over all frequencies you get infinite power. and above i showed the generalization of the formula for a non-white signal power spectrum and a non-white noise power spectrum. – robert bristow-johnson Jan 22 '18 at 17:08
  • @robertbristow-johnson Let me disagree. The channel doesn't have infinite bandwidth, so the noise present is actually $N=N_0 B$, where $N_0$ is the noise power spectral density, so that $[N_0] = \mathrm{\frac{W}{Hz}}$. Also, $N$ clearly refers to the variance of the noise, which is indeed finite. Regarding the sidenote, one must assume AWGN in order to derive the equality $$C=\sup_{p_X(x)}I(X;Y)=B\log_2(1+S/N)$$ If you can provide a derivation that gets to that formula without using AWGN, please share it. IMHO, the sidenote is correct and I haven't been able to find anything saying the opposite online. – Tendero Jan 22 '18 at 17:21
  • the channel doesn't have infinite BW, but white noise, by definition, has infinite bandwidth and constant, non-zero spectral density over all frequencies and, as a consequence of Parseval, has infinite power. – robert bristow-johnson Jan 22 '18 at 18:00
  • and the proof of the integral formula above assumes that $\log_2\left(1 + \frac{S(f)}{N(f)}\right)$ is Riemann integrable. otherwise it's trivial. – robert bristow-johnson Jan 22 '18 at 18:03
  • @robertbristow-johnson Please check Shannon's original paper, page 43: "A simple application of Theorem 16 is the case when the noise is a white thermal noise". Thermal noise is approximately Gaussian, hence my sidenote. I can edit it to "thermal noise", but I don't think it's a mistake, just a slight simplification. – Tendero Jan 22 '18 at 18:09
  • and if you consider a word-width of $w$ bits for each sample, a uniform distribution over all possible $w$-bit words, pass these words as samples to an ideal linear D/A with a sample rate of $2B$, add uniformly distributed noise of width $\Delta$ (the step-size), so the signal+noise power is $$S+N=\tfrac{(2^w \Delta)^2}{12}$$ and the noise power is $N=\tfrac{\Delta^2}{12}$, and pass that corrupted signal to an ideal linear A/D scaled the same as the D/A, you will also get the Shannon formula. this is not a proof of the formula, but it is a counterexample to the claim that the noise must be gaussian. – robert bristow-johnson Jan 22 '18 at 18:13
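(For what it's worth, a quick numerical check of the word-width argument in the last comment; the values of $w$, $\Delta$ and $B$ below are made up.)

```python
import math

# With w-bit words and step size Delta (per the comment above):
#   S + N = (2**w * Delta)**2 / 12      (signal-plus-noise power)
#   N     = Delta**2 / 12               (uniform, quantization-style noise power)
# so log2(1 + S/N) = log2((S+N)/N) = 2*w, and B*log2(1 + S/N) = (2*B)*w,
# i.e. w bits per sample at the Nyquist sample rate of 2*B.
w, Delta, B = 8, 0.01, 4_000.0

S_plus_N = (2**w * Delta) ** 2 / 12
N = Delta**2 / 12
S = S_plus_N - N

print(math.log2(1 + S / N))          # -> 16.0, which is 2*w
print(B * math.log2(1 + S / N))      # -> 64000.0 bit/s, which is (2*B)*w
```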