There is an analog system which includes a continuous-time linear equalizer (CTLE). From noise analysis, the power spectral density (PSD) of the noise in that system is available. Let's not worry about the details of how the PSD is obtained; the only thing that matters is that I have access to the noise PSD of the circuit. The noise PSD is not necessarily white and can have an arbitrary shape, but it dies out after a certain frequency (say, most of the energy is contained below 100 GHz). Having the noise PSD, I integrate it over a large enough frequency range (in this example from 0 to 100 GHz) to get the (almost) total power of the noise; call it $\sigma^2$.
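For concreteness, a minimal MATLAB sketch of that integration step, assuming the PSD is already tabulated numerically (the names `fGrid` and `Snn` are placeholders, not actual variables from my setup):

```matlab
% Assumption: Snn holds the one-sided noise PSD [V^2/Hz] on the frequency
% grid fGrid, which spans 0 to 100 GHz.
sigma2 = trapz(fGrid, Snn);   % (almost) total noise power, sigma^2
```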

Now, in my MATLAB simulation I have a clean sampled signal $s(n \,\Delta t)$ with sampling frequency $f_s$, where $f_s = 1 / \Delta t$. (To add more detail: $s(t)$ is a continuous-time signal with a bandwidth of 30 GHz, and I oversample $s(t)$ by taking 32 samples per symbol time, so the sampling frequency in this example is $f_s = 32 \times 30\,\text{GHz} = 960\,\text{GHz}$. Maybe these details don't matter.)
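In code, that sampling setup is just (numbers as in the example; the variable names are mine):

```matlab
fsym = 30e9;        % signal bandwidth / symbol rate, 30 GHz
osr  = 32;          % oversampling: samples per symbol time
fs   = osr * fsym;  % sampling frequency, 960 GHz
dt   = 1 / fs;      % sample spacing
```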

I want to add discrete white Gaussian noise to $s(n \,\Delta t)$ such that this discrete Gaussian noise has the same power (i.e., variance) as the continuous (non-white) noise, which was calculated to be $\sigma^2$. I know I can do something like `N = sigma * randn(size(s))` in MATLAB to generate noise with power $\sigma^2$, but my main question is: how does the discrete noise in this example depend on the sampling rate $f_s$? If I change $f_s$, how should the `randn()` call or $\sigma$ change?
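In other words, my current approach is the sketch below (with `sigma2` the integrated power from above); note that $f_s$ appears nowhere in it, which is exactly what I am unsure about:

```matlab
% Current approach: zero-mean white Gaussian noise whose variance equals
% the integrated continuous-noise power sigma2, independent of fs.
n = sqrt(sigma2) * randn(size(s));
y = s + n;   % noisy sampled signal
```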

Because I see it mentioned on some pages that:

"Given a continuous White Noise (Wide Sense) with variance $\sigma^2$ and you want sample it at rate of $f_s$ you should generate discrete noise samples with variance of $f_s \, \sigma^2$",

but I don't know the mathematical reasoning behind it, or whether the same applies in my case.

If you have any good read on this matter, that would be really appreciated.

  • I would refer to this answer: https://dsp.stackexchange.com/a/8632/26081. Ideal sampling is a filter with finite bandwidth $f_s$; then you generate samples with variance $f_s \sigma^2$. – AlexTP May 31 '18 at 12:07
  • I completely agree with @Royi that "It is important to have definitions straight." This really is about statistical communications, or communications within a context of added noise. We need to be careful about the definition of "white noise". You do well by giving it an overall limiting bandwidth that you say is 100 GHz. If you are replacing your noise with white noise over that same bandwidth, then that limiting frequency must be part of your mathematical description, if you are to understand a finite variance, $\sigma^2$ of the noise. That "100 GHz" must be part of it. – robert bristow-johnson Aug 07 '23 at 01:43
  • So, at first stab, the equation @Royi put in, should be changed to:

    $$ \sigma_n^{2} = \frac{f_\mathrm{s}}{2 \times 100\,\text{GHz}} \ \sigma^2 $$

    – robert bristow-johnson Aug 07 '23 at 01:47
  • Related: https://dsp.stackexchange.com/a/87654/21048 – Dan Boschen Aug 11 '23 at 03:43

1 Answer

It is important to have the definitions straight.
The RMS of a white noise, since it has zero mean, is its standard deviation.
So it is easier to talk about the Variance (the squared RMS, i.e., the Mean Square).

As you wrote, the Variance of a sampled continuous white noise is given by:

$$ {\sigma}_{n}^{2} = {f}_{s} {\sigma}^{2} $$

Where $ \sigma_n $ is the RMS of the sampled noise and $ R_{xx} \left( \tau \right) = \sigma^{2} \delta \left( \tau \right) $ is the Auto Correlation of the White Noise Process.
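A one-line way to see it (assuming an ideal anti-aliasing LPF spanning $\left[ -\frac{f_s}{2}, \frac{f_s}{2} \right]$, which is the implicit model here): the PSD of the white process is the Fourier transform of $R_{xx}(\tau)$, the constant $\sigma^2$, so the variance of the filtered and sampled noise is the area under the PSD over that band:

$$ \sigma_n^2 = \int_{-f_s/2}^{f_s/2} \mathcal{F} \left\{ \sigma^{2} \delta(\tau) \right\} \, \mathrm{d}f = \int_{-f_s/2}^{f_s/2} \sigma^{2} \, \mathrm{d}f = f_s \, \sigma^{2} $$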

As we can see, the noise RMS is higher the larger the LPF bandwidth of the sampling system is. This is why it is advised to sample according to the data BW, in order to accumulate as little noise as possible.

Remark 001: Just to make things more coherent: the actual noise variance is defined by the LPF of the sampling system. So if the LPF BW is given by $ B $ (assuming an ideal LPF), then the noise variance is given by $ \sigma_{n}^{2} = B \sigma^{2} $, where $ B $ is the total area below the LPF graph (both sides).
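A quick MATLAB sanity check of this remark (my own construction, with made-up parameters `fA`, `B`, `sigma2`, and `Nsamp`): emulate the continuous white noise by a dense discrete stand-in, apply a brick-wall LPF of total (two-sided) width $B$, and compare the measured variance with $ B \sigma^{2} $:

```matlab
% Emulate "continuous" white noise with flat two-sided PSD level sigma2 by
% a dense discrete stand-in at analog rate fA (its variance is sigma2*fA).
fA = 1e3; B = 100; sigma2 = 2; Nsamp = 2^20;
x = sqrt(sigma2 * fA) * randn(Nsamp, 1);
% Ideal (brick-wall) LPF of total two-sided width B, applied via the FFT.
X = fft(x);
f = (0:Nsamp-1)' / Nsamp * fA;
f(f >= fA/2) = f(f >= fA/2) - fA;   % map FFT bins to [-fA/2, fA/2)
X(abs(f) > B/2) = 0;
y = real(ifft(X));
fprintf('measured var = %.2f, predicted B*sigma2 = %.2f\n', var(y), B*sigma2);
```

The measured value lands near $ B \sigma^{2} = 200 $ rather than near $ \sigma^{2} f_A $, illustrating that it is the LPF width, not the raw rate, that sets the accumulated noise power.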

Remark 002: White noise, in the context of Signal Processing, is usually defined only by its Auto Correlation function. Mathematically there are deeper models which define it, yet they are out of the scope of this question.

  • Can you be more mathematically specific in your answer? Basically, what you propose does not make sense energy-wise, because a higher sampling frequency means more frequency separation and hence a wider bandwidth for the white noise. So I don't get why the RMS should remain the same. I tried to explain the problem in other words in the following post; please read it and let me know if you have any explanation for that: https://dsp.stackexchange.com/questions/49574/frequency-spectrum-of-a-sampled-signal-psd-and-power-discussion – shampar Jul 19 '18 at 19:52
  • It does make sense. Imagine an ideal LPF. Then the RMS of the noise going through the analog chain (LPF + sampling) is exactly the white-noise level multiplied by the bandwidth. – Royi Apr 16 '21 at 09:26
  • @shampar, Could you please mark my answer? – Royi Jun 30 '22 at 05:46
  • Hey, I just saw this, 5 years later. Royi, continuous-time white noise (ya know, the thing we represented with $\frac{\eta}{2}$ as the level over all frequencies, positive and negative) has infinite bandwidth, infinite power, infinite variance, and infinite standard deviation. In continuous time, white noise only makes sense in a model when that $\frac{\eta}{2}$ is multiplied by some bandwidth $B$ in both the positive and negative frequencies (leaving a noise variance of $\eta B$). But, of course, sampled signals (including noise signals) have finite bandwidth $\frac{f_\mathrm{s}}{2}$. – robert bristow-johnson Aug 05 '23 at 04:07
  • @robertbristow-johnson, The bandwidth is $ f_s $. It is spanned over $ \left[ -\frac{f_s}{2}, \frac{f_s}{2} \right] $. So the above holds. To make it more accurate, we actually need to take into account the BW of the low-pass filter. So there is an implicit assumption that the pre-sampling low-pass filter matches exactly the sampling frequency. This assumption is common, so no issues there. – Royi Aug 05 '23 at 09:46
  • @Royi we agree on the physical reality. We might have a difference in convention. I went back to my old Communications Systems textbooks (my A. Bruce Carlson is my favorite). They define *white noise* as having this constant power spectrum: $$ S_{nn}(f) = \frac{\eta}{2} \qquad \forall f \in \mathbb{R} $$ Of course the area under that curve (which is the total power) is infinite. So, in order to get to a finite power (which is equal to the variance of the zero-mean stationary random process) you multiply that by the bandwidth *both* in the negative and positive frequencies. – robert bristow-johnson Aug 06 '23 at 20:04
  • @robertbristow-johnson, Both sides is exactly what I did and you didn't. It is really simple, it is an integral of a constant. So the integral, both sides, is the variance of the sampled noise. – Royi Aug 06 '23 at 20:09
  • So, no matter where the band is (how far bumped up to some IF or RF frequency), if the one-sided bandwidth (which is what we deal with when we examine bandpass signals and the analytic signal, $x_\mathrm{a}(t) = x(t) + j \hat{x}(t)$) is defined to be $B$, it's the same $B$ when the Sampling Theorem is considered ($f_\mathrm{s}>2B$). Then the finite power (and variance) is $$ \sigma_n^2 = \eta B. $$ – robert bristow-johnson Aug 06 '23 at 20:10
  • Now, you can, in the limit, bring $B$ right up to virtually $\frac{f_\mathrm{s}}{2}$. If you do, then the variance is $$ \sigma_n^2 = \eta\frac{f_\mathrm{s}}{2} $$ and that is the variance you would use to simulate sampled white noise with a random number generator. So, suppose you had a good RNG that generated really good and independent uniform random values between 0 and 1. If, for each sample of simulated noise, you generated 12 of those values, added them up, and subtracted 6, then you would have something very close to Gaussian "white" noise with zero mean and unity variance. – robert bristow-johnson Aug 06 '23 at 20:13
  • Now, that's what $\sigma_n^2 = 1$ would be. Then, if you divide that by $\frac{f_\mathrm{s}}{2}$ you get $$ \eta = \frac{2}{f_\mathrm{s}} $$ and half of that is the height of the $S_{nn}(f)$ curve in between $-\frac{f_\mathrm{s}}{2}$ and $+\frac{f_\mathrm{s}}{2}$. The $S_{nn}(f)$ power spectrum is *zero* for all other frequencies. But that's not exactly "white", is it? – robert bristow-johnson Aug 06 '23 at 20:22
  • @robertbristow-johnson, We're talking about a real-valued low-pass filter. It is two-sided. The low-pass filter is what's important. The area of the integral below the LPF is the variance. So if the input is white noise -> constant, then it is just a multiplication of that constant by the area of the LPF, which is what I did. The notation $ \frac{\eta}{2} $ is unique to communications because of the way some integrals there are defined. – Royi Aug 06 '23 at 20:23
  • I agree. But we define $B$ as the one-sided bandwidth. Both for noise spectrum and for the sampling theorem. It's a consistent convention. And, when you have sampled white noise and you wanna look at the spectrum of that in the *continuous*-time domain (so that you can discuss what happens when the sample rate is changed), that's what you gotta do. – robert bristow-johnson Aug 06 '23 at 20:25
  • @robertbristow-johnson, Call it any way you'd like. It is just the area beneath the LPF. In case the LPF matches exactly the sampling rate, then its width is $ f_s $, which is what I used. – Royi Aug 06 '23 at 20:29
  • @robertbristow-johnson, I added a remark about the LPF and the area below it. – Royi Aug 06 '23 at 20:32
  • I think the statement that first caught my attention was "$\sigma$ is the standard deviation of the White Noise." But there is no finite variance, no finite standard deviation for the concept we call "white noise". It's infinite and you only get a finite value with a finite bandwidth for the system. Of course that finite bandwidth is directly related to the sampling frequency. When talking about how high the $S_{nn}(f)$ curve is for white noise, you should not call it the "variance" or "standard deviation" and you should not call it "$\sigma$". That's the reason they use another symbol. – robert bristow-johnson Aug 06 '23 at 20:36
  • @robertbristow-johnson, It is the parameter defining the variance (Whatever multiplies the delta function). – Royi Aug 06 '23 at 20:37
  • The value of $R_{xx}(\tau)$ at $\tau = 0$ is the area under the curve of $S_{xx}(f)$ for all $f$. But the area under the curve is the *power* of $x(t)$. It's the same as $\sigma_x^2$. It's the variance of $x(t)$ at some randomly-picked time $t$. – robert bristow-johnson Aug 06 '23 at 20:42
  • You should use $\sqrt{\frac{\eta}{2}}$ instead of "$\sigma$". And you shouldn't call it "standard deviation". $\frac{\eta}{2}$ is the power per unit frequency. It is not the power. For zero-mean processes, variance and power are the same value. – robert bristow-johnson Aug 06 '23 at 20:44
  • @Royi, physical reality doesn't give a rat's ass about human notational convention. It is my choice to stick with an already long established notational convention, simply so that if textbooks or other literature is consulted, we might not have to worry about some factor of 2 or 4 because of choice of convention and we can use the equations and narrative of that lit to read, gain insight, and understand the present situation. So whether $\eta$ is divided by 2 or not isn't what I am bitching about. But it is objectively false to write or imply that white noise has a finite standard deviation. – robert bristow-johnson Aug 06 '23 at 23:12
  • @Royi you wrote this: "It is the parameter defining the variance (Whatever multiplies the delta function)." - - - It's crap. It's objectively crap. At least it is if what you mean is the delta function regarding the autocorrelation of white noise. I am taking issue with you about any claim of white noise having a (finite) variance or standard deviation. And I am taking issue with you about relating variance of a random process to a dirac delta function that may represent (theoretically) the autocorrelation of the random process. – robert bristow-johnson Aug 06 '23 at 23:17
  • @robertbristow-johnson, The actual common notation is using $\sigma$. See Wikipedia, White Noise: https://en.wikipedia.org/wiki/White_noise. There is no factor of 2 or anything. What I wrote is correct with no factors. In your notation, $ \frac{\eta}{2} = \sigma^2 $. By the way, usually it is called $ N_{0} $. Have a look at: https://www.probabilitycourse.com/chapter10/10_2_4_white_noise.php. The random process has a finite variance! It is the auto correlation which has infinite energy. – Royi Aug 06 '23 at 23:19
  • I'm in agreement with the probability course link. They got it right. You still do not. I wouldn't depend on Wikipedia to be authoritative over the texts and the literature. Now, what I want to nail you down on, does this white noise have a finite variance (or finite standard deviation) or not? Will you answer that? – robert bristow-johnson Aug 06 '23 at 23:21
  • @robertbristow-johnson, You may go through the definition of Gaussian Process: https://en.wikipedia.org/wiki/Gaussian_process. You'd see how it relates to Brownian Motion and White Noise. – Royi Aug 06 '23 at 23:27
  • You're wrong, and objectively so. Before you sample *anything*, you better bandlimit it to half of the sample rate. If you don't do that, all of the images above Nyquist (and the power associated with those images) will fold over into the baseband. So when you say that the samples have finite power and finite variance, it's *only* because of the bandlimit. That reference to the probability course agrees with me right out of the starting block. – robert bristow-johnson Aug 06 '23 at 23:27
  • @robertbristow-johnson, Before tackling the Random Process through linear system, now you agree that White Noise has finite variance? Because you wanted to nail me with this :-). – Royi Aug 06 '23 at 23:29
  • Remember white noise has *infinite* power. It's a concept, not a real thing. Even Johnson-Nyquist noise, even though is normally modeled as white going to infinity, actually has a finite bandwidth associated with a rolloff in frequency at very very high frequencies (like visible light frequencies). But we model it as a constant power spectrum to infinity because it easier. – robert bristow-johnson Aug 06 '23 at 23:30
  • @robertbristow-johnson, You're mixing Auto Correlation and time stamps. It has infinite power like any signal with infinite support and a function which doesn't decay (hence becomes non-integrable). Yet, per sample, it has a well-defined variance which is not infinite. If it were infinite, it couldn't have been Gaussian. – Royi Aug 06 '23 at 23:32
  • //" Before tackling the Random Process through linear system, now you agree that White Noise has finite variance?"// - - - - You're not being intellectually honest here. You better read what I said. Read what you said. Read what your probability course reference says. Read what the textbooks say.

    I am saying that you're *objectively* mistaken when you claim that (theoretically modeled) white noise, before it's bandlimited by a system, has finite power. It simply does not.

    – robert bristow-johnson Aug 06 '23 at 23:33
  • //"You're mixing Auto Correlation and time stamps. It has infinite power like any signal with infinite support and a function "// Bullshit. It's *not* because it has infinite support in the time axis. It's because it has infinite support on the frequency axis and does not decay at high frequencies. – robert bristow-johnson Aug 06 '23 at 23:34
  • Read the damn link you cited. And be intellectually honest with it and intellectually courageous. – robert bristow-johnson Aug 06 '23 at 23:35
  • *White noise is a theoretical notion that does not exist in physical reality. White noise does not have finite power. Bandlimited white noise (which is not the same as white noise before bandlimiting) does have finite power that is proportional to the bandwidth.* – robert bristow-johnson Aug 06 '23 at 23:37
  • This is not about the difference between finite power and finite energy signals. Bandlimited white noise is still a finite power and not-finite energy signal (because of infinite support in the time axis). But white noise, as a concept, that we model with a constant in power spectrum and as a dirac delta as autocorrelation, has infinite power. It has to. And you mislead if you imply that it has finite power. It simply does not. – robert bristow-johnson Aug 06 '23 at 23:40
  • //" If you have a White Noise: $X(t) \sim N(0, \sigma^2)$ ..."// But "White Noise" does *not* have variance of any finite $\sigma$. What you are assuming (and passing off as knowledge, which misleads) is false. Objectively so. – robert bristow-johnson Aug 06 '23 at 23:53
  • Also, $R_{XX}(\tau) = \sigma^2 \delta(\tau)$ implies that the power of $X(t)$ is infinite. It is not the area under $R_{XX}(\tau)$ that is the power. It is the *value* of $R_{XX}(\tau)$ at $\tau=0$ that is the power. – robert bristow-johnson Aug 06 '23 at 23:58
  • @Royi, to be more specific, in a comment directly to the question, I posted the net correction. One way to know that something is missing is to do a little bit of unit or dimensional analysis. You can't take a variance, scale it by frequency (which is not dimensionless), and come up with another variance of something having the same units. – robert bristow-johnson Aug 07 '23 at 01:54
  • @robertbristow-johnson, I don't want to get into the Math; you may look at some of the books in the field of Measure Theory. AWGN is a limit of a Gaussian Vector defined as I defined it. When I say the integral doesn't converge, I mean the integral defined by the Auto Correlation function. I will give you food for thought: what's the auto correlation of a white random continuous noise with uniform $ \left[ 0, 1 \right] $ distribution? Give it a thought. It will make everything make sense for you. – Royi Aug 07 '23 at 06:44
  • If you want to see the deep Math and what I meant by the auto correlation not converging, look at https://math.stackexchange.com/questions/134193. Now you'd see that the use of the delta is just because, in the limit, the Gaussian Process which is IID is not integrable. Again, the Math model learned in EE is not enough to rigorously understand White Noise. This is what causes your headache. – Royi Aug 07 '23 at 06:58
  • A way around this problem is using the derivative of the Wiener Process: http://mbhauser.com/informal-notes/white-gaussian-noise.pdf. You see the $ \sigma $? – Royi Aug 07 '23 at 07:03
  • Yes, we know that Brown noise (also called the "random walk" or "drunk's walk", which has power spectral density, PSD, proportional to $\frac{1}{f^2}$) is the integral w.r.t. time of white noise. Power is still the integral over all $f$ of the PSD. And, if you do not include integrating around $f=0$ (nasty singularity), the power of Brown noise would come out to be finite. Doesn't change a thing. And it doesn't matter whether it's learned as EE or not: white noise has infinite power, not finite power. There is no finite variance for white noise, and your answer is wrong and misleading. – robert bristow-johnson Aug 07 '23 at 12:44
  • @robertbristow-johnson, I couldn't respond to you for a few days. Anyhow, I think we have saturated the discussion. If you want the complete derivation of White Noise, I can only suggest looking at the literature of Measure Theory. There are methods to derive it in a way which is more complete than only having the auto correlation. – Royi Aug 17 '23 at 13:38
  • @Royi, I have had measure theory when I was in my Real Analysis course 40 years ago. I don't give a rat's ass. We're talking about two different things. Any "white noise" that you might refer to that has a finite power without a bandlimiting parameter is not the white noise that electrical engineers and signal processing and communications systems engineers are discussing. – robert bristow-johnson Aug 17 '23 at 18:13
  • The power spectral density of *any* random or deterministic process (that is a power signal) is given by $$S_{xx}(f)=\mathscr{F}\Big\{R_{xx}(\tau)\Big\}$$ where $$R_{xy}(\tau)=\lim_{T\to+\infty}\frac{1}{T}\int_{-\frac{T}{2}}^{\frac{T}{2}}x(t)\,y^*(t+\tau)\ \mathrm{d}t$$ where $y^*(t)$ is the complex conjugate of $y(t)$. That is the mathematical basis that I start with. Ergodicity adds another expression in that the time-domain averages are the same as the probabilistic averages. – robert bristow-johnson Aug 17 '23 at 19:38
  • @robertbristow-johnson, The Wiener-Khinchin Theorem you use above actually doesn't hold unless you assume some correlation for the noise process. It can be shown mathematically. I think we have gone too far, so we don't remember why we started. White Noise is defined differently by Signal Processing practitioners and Physics / Measure Theory people. The Signal Processing derivation lacks some depth you can get from the others. For instance, the integral of White Gaussian Noise has smooth realization paths. What does it mean for the samples of the White Noise realization? – Royi Aug 18 '23 at 04:31
  • //" The Wiener Khinchin Theorem you use above actually doesn't hold unless you assume some correlation for the noise process."// - - -

    Such as $$ R_{xx}(\tau) = \frac{\eta}{2} \delta(\tau) $$

    //" White Noise is defined differently by Signal Processing practitioners and Physics / Measure Theory people."// Not differently than the physicists. Electrical engineers and physicists are looking at it about the same, I think.

    – robert bristow-johnson Aug 18 '23 at 04:38
  • Now, I will agree with you that physicists might not look at thermal noise as white. Johnson noise is not infinite in power, so it rolls off somewhere. My book says that the power spectrum of the thermal noise of a resistor, $R$, is: $$ S_v(f) = \frac{4\pi R \hbar |f|}{e^{2\pi \hbar |f|/(k_\mathrm{B}T)}-1} $$

    SI units would be volts-squared per hertz.

    – robert bristow-johnson Aug 18 '23 at 04:45
  • @robertbristow-johnson, Read "Roy M. Howard - On Defining White Noise". It will give you a broader look. From its abstract "It is shown that it is necessary to accept an arbitrarily small correlation time, and a flat power spectral density approximation over a finite, but arbitrarily large, frequency range, to adequately define a white noise random process consistent with the Wiener-Khintchine relationships. ". It will broaden some Math concepts. By the way, it is what Wiener did when he derived White Noise. I know well the Math you're writing deeply. Where would you like this to go? – Royi Aug 18 '23 at 04:46
  • @robertbristow-johnson, Simple question, please don't skip it like you did with the others. Does the Wiener Process have a continuous realization path? – Royi Aug 18 '23 at 04:47
  • @Royi, this is not what the question is about. At least it's not what the title of the question is about. Again, in the context where we are, white noise has infinite power because it has infinite bandwidth. It's not really a physical thing. In the context here, we look at dirac impulse functions and *functions* and not worry too much about the fact from Real Analysis that says if $f(x)=g(x)$ almost everywhere, then their integrals are equal. Because we say that $\int \delta(x) \, \mathrm{d}x = 1$ and $\int 0 \, \mathrm{d}x = 0$ yet they are equal almost everywhere. – robert bristow-johnson Aug 18 '23 at 04:54
  • And the Wiener-Khintchine theorem is what says that the power spectral density is related to the autocorrelation function by the Fourier Transform. At least that is all I am going to subscribe to it. – robert bristow-johnson Aug 18 '23 at 04:56
  • So now ask the question that the OP is asking. Does any of your measure theory apply? Or does the regular old, meat and potatoes electrical engineering understanding of it apply? – robert bristow-johnson Aug 18 '23 at 04:57
  • @robertbristow-johnson, Measure theory and the deep math of random processes have nothing to do with the question. You went there. Again, White Noise as taught in a Signal Processing context is not the whole story. There is more complete Math around it. – Royi Aug 18 '23 at 05:32
  • @robertbristow-johnson, To offer a deeper discussion, I opened https://dsp.stackexchange.com/questions/89080. You may add your point of view. Please read mine as well. – Royi Aug 18 '23 at 16:57