
I have an SDR that is capable of 2.4 MS/s. I am using it to measure the power of signals. Right now I am setting the sample rate to 2 MS/s, collecting a burst of samples, and calculating the average power using

10 * log10(sum(abs(sample)**2 for sample in samples) / len(samples))

This gives me the average power over a 2 MHz bandwidth, I think. I'd like to limit that. One approach is to use a filter, but it would be simpler to just sample less often (i.e., sample at my target bandwidth). Will the output measurements be as accurate?
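For reference, the expression above can be written in vectorized NumPy form. This is only a sketch: the synthetic noise burst stands in for real SDR samples, and the 4096-sample length is arbitrary.

```python
import numpy as np

# Stand-in for a captured burst: complex white noise normalized to
# unit average power (a real capture would come from the SDR driver).
rng = np.random.default_rng(0)
samples = (rng.standard_normal(4096)
           + 1j * rng.standard_normal(4096)) / np.sqrt(2)

# Average power across the full sampled bandwidth, in dB --
# the vectorized form of the expression above.
power_db = 10 * np.log10(np.mean(np.abs(samples) ** 2))
print(power_db)  # close to 0 dB for unit-power noise
```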

  • 1
    It's not clear what you want to achieve, and that dictates how you'd measure "accurate". So, give us the bigger picture of what you're doing. If you don't care about your signal, so that you can just sample it at arbitrary rates, I'd argue that you could also just not sample it at all and ignore it! – Marcus Müller Dec 03 '17 at 01:37

2 Answers

3

Depends on your signal. Typically we sample at or above the Nyquist rate, which is twice the maximum frequency of your signal. If you sample less often than that, energy outside the reduced bandwidth aliases into your measurement band, so you lose information and the power reading will not be correct.
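A quick sketch of why (illustrative numbers, not necessarily the asker's setup): a tone outside the target bandwidth aliases into band when you simply keep every Nth sample, so it still contributes its full power; low-pass filtering before decimating removes it.

```python
import numpy as np
from scipy import signal

fs = 2_000_000                  # assumed original rate: 2 MS/s
n = 1 << 16
t = np.arange(n) / fs

# Tone at 600 kHz -- outside a +/-250 kHz target band (decimation by 4).
tone = np.exp(2j * np.pi * 600e3 * t)

# Naive "sample less often": keep every 4th sample. The tone aliases
# into the new band and its power is still measured in full.
p_naive = 10 * np.log10(np.mean(np.abs(tone[::4]) ** 2))

# Proper decimation: anti-aliasing low-pass filter, then downsample.
decimated = signal.decimate(tone, 4, ftype='fir')
p_filtered = 10 * np.log10(np.mean(np.abs(decimated) ** 2))

print(p_naive, p_filtered)  # ~0 dB vs. strongly attenuated
```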

goldrik
1

It also depends on the hardware you are using. Some SDRs have an integrated anti-aliasing filter whose bandwidth is set automatically when you set the sample rate through the API. Check your API documentation.