In digital data transmission, speed is measured in bits per second.

1) Is there any concept of speed in analog data transmission?
2) The Shannon–Hartley theorem limits the speed achievable in digital data communication. Is there an equivalent theorem for analog data communication?
Additional information:
The Shannon–Hartley theorem states the channel capacity C, meaning the theoretical tightest upper bound on the information rate of data that can be communicated at an arbitrarily low error rate using an average received signal power S through an analog communication channel subject to additive white Gaussian noise of power N:

$$C = B \log_2\!\left(1 + \frac{S}{N}\right)$$

where

- C is the channel capacity in bits per second, a theoretical upper bound on the net bit rate (information rate, sometimes denoted I) excluding error-correction codes;
- B is the bandwidth of the channel in hertz (passband bandwidth in case of a bandpass signal);
- S is the average received signal power over the bandwidth (in case of a carrier-modulated passband transmission, often denoted C), measured in watts (or volts squared);
- N is the average power of the noise and interference over the bandwidth, measured in watts (or volts squared);
- S/N is the signal-to-noise ratio (SNR) or the carrier-to-noise ratio (CNR) of the communication signal to the noise and interference at the receiver (expressed as a linear power ratio, not as logarithmic decibels).
Source: https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem
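For a concrete sense of what the formula predicts, here is a minimal sketch that evaluates C for a hypothetical 3 kHz voice-grade channel with a 30 dB SNR (these values are chosen purely for illustration and are not part of the quoted source):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example values: 3 kHz bandwidth, 30 dB SNR
snr_db = 30.0
snr_linear = 10 ** (snr_db / 10)          # 30 dB corresponds to a linear power ratio of 1000
capacity = shannon_capacity(3000.0, snr_linear)
print(f"C = {capacity:.0f} bit/s")        # roughly 29.9 kbit/s
```

With those assumed numbers the capacity works out to about 29.9 kbit/s; note that the SNR must be converted from decibels to a linear power ratio before it is plugged into the formula.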