
This might seem an easy question, and without any doubt it is, but I have been trying to calculate the variance of white Gaussian noise without any result.

The power spectral density (PSD) of additive white Gaussian noise (AWGN) is $\frac{N_0}{2}$, while the autocorrelation is $\frac{N_0}{2}\delta(\tau)$, so is the variance infinite?

Royi
Mazzy
  • Isn't the noise power the variance of the noise voltage? One could also ask about the variance (or standard deviation) of the power measured over a specific time interval. I think the central limit theorem would describe the relationship between the measurement duration and the variance of the results. –  Sep 09 '15 at 14:22

3 Answers


White Gaussian noise in the continuous-time case is not what is called a second-order process (meaning $E[X^2(t)]$ is finite) and so, yes, the variance is infinite. Fortunately, we can never observe a white noise process (whether Gaussian or not) in nature; it is only observable through some kind of device, e.g. a (BIBO-stable) linear filter with transfer function $H(f)$ in which case what you get is a stationary Gaussian process with power spectral density $\frac{N_0}{2}|H(f)|^2$ and finite variance $$\sigma^2 = \int_{-\infty}^\infty \frac{N_0}{2}|H(f)|^2\,\mathrm df.$$

More than what you probably want to know about white Gaussian noise can be found in the Appendix of this lecture note of mine.
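The finite-variance formula above is easy to check numerically. Below is a minimal sketch (all parameter values, and the choice of an ideal lowpass for $H(f)$, are my own illustrative assumptions, not taken from the answer): white noise with two-sided PSD $N_0/2$ is approximated by discrete samples of variance $(N_0/2)f_s$, an ideal lowpass of bandwidth $B$ is applied in the frequency domain, and the sample variance is compared with $\int_{-\infty}^\infty \frac{N_0}{2}|H(f)|^2\,\mathrm df = N_0 B$.

```python
import numpy as np

# Illustrative parameters (assumed for this sketch)
N0 = 2.0      # two-sided PSD is N0/2
fs = 1000.0   # sampling rate of the discrete approximation (Hz)
B = 50.0      # one-sided bandwidth of the ideal lowpass filter (Hz)
n = 2**20     # number of samples

rng = np.random.default_rng(0)

# Discrete approximation of AWGN: samples with variance (N0/2)*fs,
# so the PSD is flat at N0/2 over [-fs/2, fs/2].
x = rng.normal(scale=np.sqrt(N0 / 2 * fs), size=n)

# Ideal lowpass |H(f)| = 1 for |f| <= B, zero elsewhere,
# applied in the frequency domain.
X = np.fft.rfft(x)
f = np.fft.rfftfreq(n, d=1 / fs)
X[f > B] = 0.0
y = np.fft.irfft(X, n)

# Theory: sigma^2 = integral of (N0/2)|H(f)|^2 df = (N0/2)*2B = N0*B
print(np.var(y), N0 * B)   # the two values should be close
```

With these (assumed) numbers the empirical variance comes out close to $N_0 B = 100$, while the unfiltered samples have variance $(N_0/2)f_s = 1000$, a value that grows without bound as the bandwidth of the approximation is increased.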

Dilip Sarwate
  • The curious thing about this for me is that the $\sigma^2$ parameter that is used as the "variance" of the Gaussian distribution of $x(t)$ is not the variance of the sequence. As you say, it's because $E[x^2(t)]$ is infinite. Thanks for the clear explanation! – Peter K. Apr 13 '13 at 00:29
  • @PeterK. There is a difference between the notions of white Gaussian noise for discrete time and continuous time. If a discrete-time process is considered as samples from a continuous-time process, then, taking into consideration that the sampler is a device with a finite bandwidth, we get a sequence of independent Gaussian random variables of common variance $\sigma^2$ which is what you have in your answer. If your $Y[n]$ is $$Y[n]=\int_{(n-1)T}^{nT}X(t)\,\mathrm dt$$ where $X(t)$ is the OP's AWGN, then $\sigma_{Y[n]}^2=\frac{N_0}{2}T$, not $\frac{N_0}{2}$ as you have it (except if $T=1$). – Dilip Sarwate Apr 13 '13 at 01:22
  • Understood, Dilip! I've not had this aspect pointed out before (or I did, and it's been forgotten). Good stuff! – Peter K. Apr 13 '13 at 02:18
  • @DilipSarwate I read your interesting appendix. But you say "One should not, however, infer that the random variables in the WGN process are themselves Gaussian random variables". I did not fully understand this. If the random variables aren't Gaussian (and this seems reasonable to me since they have infinite variance), why is the process named Gaussian? – Surfer on the fall Jul 04 '17 at 07:04
  • @Surferonthefall Try writing down the probability density function $f_{X(t)}(x)$ of the alleged Gaussian random variables in the white Gaussian noise process $\{X(t)\colon -\infty < t < \infty\}$. The density function has value $0$ for all $x$. How can $X(t)$ be viewed as a Gaussian random variable? As I said repeatedly in the document you read, one should not look too closely at the random variables in a white noise process $\{X(t)\colon -\infty < t < \infty\}$. The process is a mythical one and it is defined by what it produces at the output of a linear filter, not by anything else. – Dilip Sarwate Jul 04 '17 at 14:21
  • @DilipSarwate sorry, I can't understand you. I know that, let's say, $X_4$ (the value assumed at $t=4$) is a random variable. Why can't it be Gaussian as WGN seems to suggest? [I originally thought that the process was a collection of independent Gaussian variables, but this clearly conflicts with the variance being infinite.] I just can't get what you mean by saying "The density function has value 0 for all x." Could you please explain it a bit? I would be very grateful to you! I find this topic really confusing as it is traditionally treated :( – Surfer on the fall Jul 04 '17 at 15:08
  • The probability density function of a zero-mean Gaussian random variable is $$f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp(-x^2/2\sigma^2), -\infty < x < \infty.$$ What is the value of $f_X(1)$ if you "set" $\sigma=\infty$, or more properly, take the limit as $\sigma \to 0$? of $f_X(35.2869)$? of $f_X(x)$ for each and every choice of real number $x$, $-\infty < x < \infty$? – Dilip Sarwate Jul 04 '17 at 15:55
  • @DilipSarwate Ok, I got it. So you are saying that they are a sort of degenerate Gaussian random variable. However, this problem doesn't exist if we consider white noise in a weak sense, so that $S(f)$ is a large rect, since the variance would then be finite and the random variables just traditional Gaussians. Am I right? – Surfer on the fall Jul 04 '17 at 21:05
  • Sorry, that should have read "... take the limit as $\sigma \to \infty$" not as $\sigma \to 0$. – Dilip Sarwate Jul 04 '17 at 21:27
  • @DilipSarwate sure, I got it. Could you tell me if the reasoning in my last message is correct? Thanks again! – Surfer on the fall Jul 05 '17 at 06:04
  • I've just read this and I'm really confused now. AFAIK, WGN means that each $X(t_0)$ is a Gaussian random variable with finite variance, and independent from every other $X(t) \ \forall t\neq t_0$. Is this intuition wrong? – Tendero Mar 02 '18 at 22:12
  • @Tendero For continuous time, white Gaussian noise is not what you say it is; for discrete time it is. See my comment in response to Peter K.'s comment above regarding discrete-time white Gaussian noise, which is indeed "$X[n] \sim N(0,\sigma^2)$ for all $n$; $X[n]$ and $X[m]$ independent for $n\neq m$", but here $m$ and $n$ are restricted to be integers. The process that you state for continuous time is also called white Gaussian noise by some mathematicians, but it has vastly different properties. See this question over on math.SE – Dilip Sarwate Mar 02 '18 at 22:39
  • Nicely written lecture note - thanks for the link! – J Richard Snape Oct 07 '22 at 07:25
  • @DilipSarwate, I don't agree with the intuition you referred to, of setting $ \sigma = \infty $ so that all density values are 0. White noise can be derived as a limit of the differentiated Wiener process. That suggests, on the contrary, that the values approach high values. I think white noise can only be seen as a limit of a process. – Royi Aug 07 '23 at 10:16
  • @Royi This is a chicken-and-egg problem. Some (including myself) define Brownian motion as the integral of white noise; others (including yourself) define white noise as the derivative of Brownian motion. An equivalent form of Kolmogorov's third axiom is that the probability of a limit is the limit of the probabilities, and I don't think that "values approach high value" quite cuts it. What is this "high value" that is being approached? Presumably it is a finite number? If so, what is it? Or is it just a euphemism for "increases without bound"? – Dilip Sarwate Aug 07 '23 at 15:28

Suppose we have a discrete-time sequence $x[t]$ which is stationary, zero-mean white noise with variance $\sigma^2$. Then the autocorrelation of $x$ is: $$R_{xx}[\tau] = E\left[ x[t]\, x[t+\tau] \right] = \begin{cases} E\left[ x[t]^2 \right], & {\rm if\ }\tau=0 \\ 0, & {\rm otherwise} \end{cases} = \sigma^2 \delta[\tau]$$ where $\delta[\tau]$ is the Kronecker delta.

So, that implies that $\sigma^2 = \frac{N_0}{2}$.
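As a quick numerical sanity check of the autocorrelation above (the value of $\sigma$ and the sequence length are arbitrary choices of mine, not from the answer), a sample estimate of $R_{xx}[\tau]$ should be close to $\sigma^2$ at $\tau=0$ and close to $0$ elsewhere:

```python
import numpy as np

# Discrete-time white Gaussian noise with an assumed sigma.
sigma = 1.5
n = 200_000
rng = np.random.default_rng(1)
x = rng.normal(scale=sigma, size=n)

def autocorr(x, tau):
    """Sample estimate of R_xx[tau] = E[ x[t] x[t+tau] ]."""
    n = len(x)
    return np.mean(x[:n - tau] * x[tau:]) if tau else np.mean(x * x)

print(autocorr(x, 0))   # close to sigma^2 = 2.25
print(autocorr(x, 1))   # close to 0
print(autocorr(x, 10))  # close to 0
```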

Peter K.

Yes it is: unless you take into account that infinite power is hard to come by in these post-big-bang times. In practice, every white noise process ends up in a physical implementation that has some capacitance and hence a limit on its effective bandwidth. Consider the (reasonable) arguments leading to Johnson resistor noise: they would predict infinite energy, except that there are always bandwidth limits in the implementation. A similar situation applies at the opposite end: $1/f$ noise. Some processes fit $1/f$ noise very well over a long time; I have measured them. But in the end you are constrained by physical laws.
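As a rough illustration of the bandwidth point (the temperature and resistance are values assumed for the sake of the example), the Johnson–Nyquist RMS noise voltage $\sqrt{4 k_B T R B}$ stays finite only because the bandwidth $B$ does:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0           # temperature, K (assumed)
R = 1e3             # resistance, ohms (assumed)

# RMS Johnson noise voltage over a measurement bandwidth B.
for B in (1e3, 1e6, 1e9):
    v = math.sqrt(4 * kB * T * R * B)
    print(f"B = {B:.0e} Hz -> v_rms = {v:.3e} V")

# As B grows without bound, so does the noise voltage; real circuits
# always roll off at some frequency, so the measured power stays finite.
```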

rrogers