I have a question about the Shannon theorem. According to this theorem, the maximum capacity of a link can be calculated with the formula: $$C=B\log_2(1+S/N)$$ where $B$ is the bandwidth and $S/N$ is the signal-to-noise ratio. But I am not sure what the "capacity" term in this equation means, since it should account for the probability of bit error in some way.
What I mean is that the capacity might not be defined as "number of bits transmitted per second" but as something like "number of bits transmitted and correctly received per second". But that still makes no sense, since you can transmit with low $S/N$ and a symbol length tending to $0$, getting a capacity that tends to infinity and a probability of bit error close to 0.5 (it is theoretically impossible to have more than a 50% error rate), so the "number of bits transmitted and correctly received per second" would also tend to infinity and would exceed the Shannon limit. I was thinking that the right definition of $C$ in the formula could be "number of bits transmitted per second $\times$ (0.5 $-$ probability of bit error)", which could make sense, but I am not sure what the real meaning of this measure is.
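To make the question concrete, here is a small sketch of the Shannon-Hartley formula itself (the function name and the 3 kHz / 30 dB example channel are my own choices, not from any standard):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz channel with a 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)              # 30 dB as a linear ratio (= 1000)
c = shannon_capacity(3000, snr)    # about 29.9 kbit/s
```

Note that this $C$ is finite no matter how short the symbols are, which is exactly what puzzles me about the "transmitted and correctly received" interpretation above.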
$$ C= \int\limits_{0}^{B} \log_2\left(1 + \frac{S(f)}{N(f)} \right)\, df$$
– robert bristow-johnson Jan 22 '18 at 06:36
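If it helps anyone checking the comment's frequency-dependent form, here is a minimal numerical sketch of that integral (midpoint rule; the function names are mine, and I assume a flat SNR just to verify it reduces to $B\log_2(1+S/N)$):

```python
import math

def capacity_colored(bandwidth_hz, snr_of_f, n=10_000):
    """Approximate C = integral_0^B log2(1 + S(f)/N(f)) df by the midpoint rule."""
    df = bandwidth_hz / n
    return sum(math.log2(1 + snr_of_f((k + 0.5) * df)) * df for k in range(n))

# With a flat S(f)/N(f) = 1000 the integral collapses to B * log2(1 + 1000).
flat = capacity_colored(3000, lambda f: 1000.0)
```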