
Since the Fourier transform of a stationary random process in time (when it exists) is not necessarily real, my question is what relation the covariances of the real and imaginary parts of the transformed random variable bear to the covariance in the time domain.

More specifically, according to the Wiener-Khinchin theorem, the Fourier transform $X(\omega)$ of the time function $x(t)$ exists under some conditions.

The question is: can one recover $$\mathbb{E}\left[\Re\{X(\omega)\}\,\Re\{X(\omega')\}\right]-\mathbb{E}\left[\Re\{X(\omega)\}\right]\mathbb{E}\left[\Re\{X(\omega')\}\right]$$ and $$\mathbb{E}\left[\Im\{X(\omega)\}\,\Im\{X(\omega')\}\right]-\mathbb{E}\left[\Im\{X(\omega)\}\right]\mathbb{E}\left[\Im\{X(\omega')\}\right]$$

from $$\mathbb{E}\left[\bar{x}(t)\,x(s)\right]-\mathbb{E}\left[\bar{x}(t)\right]\mathbb{E}\left[x(s)\right]\;?$$

I would be happy to be guided to a reference as well.

This is the most general case, but the simpler case where the input signal is real is my main concern.

  • boy, i hope i didn't mangle the meaning by trying to fix notation. i know that some mathematicians use a different notation for the Fourier Transform (with $\hat{x}$), but can you stick with what is conventional for electrical engineers? also, can you define what you mean by "$\bar{x}(t)$"? is it complex conjugate? – robert bristow-johnson Jul 07 '14 at 18:25
  • Thank you very much for your edit. I'll do my best, although I am not from a signal processing background :) It's exactly the complex conjugate. – Cupitor Jul 08 '14 at 12:27
  • I'm not sure I understand the question. Is this what you want: http://dsp.stackexchange.com/questions/13346/how-to-compute-the-statistics-of-the-dft-of-noise ? – DanielSank Jul 09 '14 at 02:41
  • @DanielSank, yes, that is very much related. But I don't understand what the answerer means by sampling enough samples. I am talking about a stochastic process in a completely theoretical framework; I don't need any empirical statistics. I believe the answer is actually that we get a diagonal covariance matrix for $X(\omega)$, and $\mathbb{E}\left[\Im\{X(\omega)\}\,\Im\{X(\omega')\}\right]-\mathbb{E}\left[\Im\{X(\omega)\}\right]\mathbb{E}\left[\Im\{X(\omega')\}\right]$ should be zero for non-equal frequencies and equal to the covariance of the real part when $\omega=\omega'$. – Cupitor Jul 09 '14 at 16:58
  • I think the basic answer to your question is that the real and imaginary parts are jointly Gaussian distributed. Does that make sense / answer your question? – DanielSank Jul 09 '14 at 18:40
  • Well, that is a fact. The point is how one should calculate it in terms of the covariance in the time domain. As I said, as far as I know the facts I mentioned are true, but I don't know how to prove them. – Cupitor Jul 09 '14 at 22:50

1 Answer


Note that in general the Fourier transform of a stationary process $x(t)$ does not exist. The Wiener-Khinchin theorem only states that under certain conditions the power spectral density of $x(t)$ exists, and it can be computed as the Fourier transform of the autocorrelation function of $x(t)$.

Having said that, if for some reason one assumes that the Fourier transform of $x(t)$ exists, then you can do the math and see if you get a useful result. So, let's see. I assume that $x(t)$ is real-valued. This is not at all necessary, but just simplifies things a bit.

Since $\text{Re}\{X(\omega)\}=\frac12[X(\omega)+X^*(\omega)]$ and $\text{Im}\{X(\omega)\}=\frac{1}{2j}[X(\omega)-X^*(\omega)]$, we can compute the desired expectations from $E\{X(\omega)X(\omega')\}$ and $E\{X^*(\omega)X(\omega')\}$. For the first of these expectations we have

$$E\{X(\omega)X(\omega')\}=E\left\{\int_{-\infty}^{\infty}x(t)e^{-j\omega t}dt \int_{-\infty}^{\infty}x(t')e^{-j\omega' t'}dt' \right\}=\\ =\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}E\{x(t)x(t')\}e^{-j\omega t}e^{-j\omega't'}dtdt'$$

With the substitution $\tau=t'-t$ and with the autocorrelation function $R_x(\tau)=E\{x(t)x(t+\tau)\}$ (assuming wide-sense stationarity of $x(t)$) we obtain

$$E\{X(\omega)X(\omega')\}=\int_{-\infty}^{\infty}R_x(\tau)e^{-j\omega'\tau}d\tau\int_{-\infty}^{\infty}e^{-j(\omega+\omega')t}dt$$
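Spelled out, the substitution step is nothing more than rewriting the double integral above with $t'=t+\tau$:

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}R_x(t'-t)\,e^{-j\omega t}e^{-j\omega' t'}\,dt\,dt'=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}R_x(\tau)\,e^{-j(\omega+\omega')t}\,e^{-j\omega'\tau}\,dt\,d\tau$$

which then factors into the product of the two integrals above.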

Interpreting the second integral as a distribution, and with the power spectrum $S_x(\omega)=\mathcal{F}\{R_x(\tau)\}$ we finally get

$$E\{X(\omega)X(\omega')\}=2\pi\delta(\omega+\omega')S_x(\omega')= 2\pi\delta(\omega+\omega')S_x(-\omega)=2\pi\delta(\omega+\omega')S_x(\omega)\tag{1}$$

because $S_x(\omega)$ is an even function (we assumed $x(t)$ to be real-valued). The other expectation $E\{X^*(\omega)X(\omega')\}$ can be derived in a completely analogous manner. The result is

$$E\{X^*(\omega)X(\omega')\}=2\pi\delta(\omega-\omega')S_x(\omega)\tag{2}$$
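For completeness, here is a sketch of that analogous computation (same steps as for (1), using $X^*(\omega)=\int_{-\infty}^{\infty}x(t)e^{j\omega t}dt$ for real-valued $x(t)$):

$$E\{X^*(\omega)X(\omega')\}=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}R_x(t'-t)\,e^{j\omega t}e^{-j\omega' t'}\,dt\,dt'=\int_{-\infty}^{\infty}R_x(\tau)e^{-j\omega'\tau}d\tau\int_{-\infty}^{\infty}e^{-j(\omega'-\omega)t}dt=2\pi\delta(\omega-\omega')S_x(\omega')$$

and $\delta(\omega-\omega')S_x(\omega')=\delta(\omega-\omega')S_x(\omega)$.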

From (1) and (2) we can obtain the desired expectations

$$E\{\text{Re}\{X(\omega)\}\text{Re}\{X(\omega')\}\}= \pi S_x(\omega)[\delta(\omega-\omega')+\delta(\omega+\omega')]\\ E\{\text{Im}\{X(\omega)\}\text{Im}\{X(\omega')\}\}= \pi S_x(\omega)[\delta(\omega-\omega')-\delta(\omega+\omega')]$$
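Written out, the step from (1) and (2) to these two expressions is just the expansion of the real and imaginary parts; note that $E\{X(\omega)X^*(\omega')\}$ and $E\{X^*(\omega)X^*(\omega')\}$ equal the right-hand sides of (2) and (1), respectively, since those right-hand sides are real:

$$E\{\text{Re}\{X(\omega)\}\text{Re}\{X(\omega')\}\}=\tfrac14E\big\{[X(\omega)+X^*(\omega)][X(\omega')+X^*(\omega')]\big\}=\tfrac14\big[2\cdot2\pi\delta(\omega+\omega')S_x(\omega)+2\cdot2\pi\delta(\omega-\omega')S_x(\omega)\big]$$

which equals $\pi S_x(\omega)[\delta(\omega-\omega')+\delta(\omega+\omega')]$. The imaginary part is the same calculation with the factor $1/(2j)^2=-\tfrac14$ and minus signs on the cross terms, which flips the sign of the $\delta(\omega+\omega')$ contribution.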

It remains to compute the quantities $E\{\text{Re}\{X(\omega)\}\}$ and $E\{\text{Im}\{X(\omega)\}\}$. We can easily derive them from the expectations $E\{X(\omega)\}$ and $E\{X^*(\omega)\}$:

$$E\{X(\omega)\}=E\left\{ \int_{-\infty}^{\infty}x(t)e^{-j\omega t}dt \right\}= \int_{-\infty}^{\infty}E\{x(t)\}e^{-j\omega t}dt= \mu_x\int_{-\infty}^{\infty}e^{-j\omega t}dt=2\pi\mu_x\delta(\omega)$$

with $\mu_x=E\{x(t)\}$, where we have again assumed wide-sense stationarity of $x(t)$. Obviously, we get the same result for $E\{X^*(\omega)\}$:

$$E\{X^*(\omega)\}=2\pi\mu_x\delta(\omega)$$

This results in

$$E\{\text{Re}\{X(\omega)\}\}=2\pi\mu_x\delta(\omega)\quad\text{and}\quad E\{\text{Im}\{X(\omega)\}\}=0$$

So we have

$$E\{\text{Re}\{X(\omega)\}\}\cdot E\{\text{Re}\{X(\omega')\}\}=4\pi^2\mu_x^2\delta(\omega)\delta(\omega')\\ E\{\text{Im}\{X(\omega)\}\}\cdot E\{\text{Im}\{X(\omega')\}\}=0\tag{3}$$

Combining (3) with (1) and (2) you obtain the desired result. From the result you can see that the covariances of the real parts of $X(\omega)$ and $X(\omega')$ vanish everywhere except for $\omega=\pm\omega'$. The same is true for the covariances of the imaginary parts. The difference is an additional term for $\omega=\omega'=0$ for the covariance of the real parts.
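Since the continuous-time result involves $\delta$ impulses (and hence infinite powers), a finite-length, discrete-time version is easier to check numerically. Below is a minimal sketch, assuming white Gaussian noise (flat power spectrum) and using the DFT as the finite-window analogue of $X(\omega)$; it only illustrates the covariance structure derived above empirically.

```python
import numpy as np

# Empirical check of the covariance structure of the DFT of a real,
# wide-sense stationary process, using white Gaussian noise so that
# the power spectrum is flat.  Illustration only.

rng = np.random.default_rng(0)
N = 64           # window length (number of time samples)
trials = 20000   # number of independent realizations
sigma2 = 1.0     # variance of the white noise

# Each row is one realization of the (real-valued) process.
x = rng.normal(scale=np.sqrt(sigma2), size=(trials, N))
X = np.fft.fft(x, axis=1)     # finite-window analogue of X(omega)

re, im = X.real, X.imag

# Covariance matrices over frequency bins (bins are the "variables").
cov_re = np.cov(re, rowvar=False)
cov_im = np.cov(im, rowvar=False)

k, kp = 5, 9   # two arbitrary, non-mirrored bins
print("var Re[X_k]            :", cov_re[k, k], "(theory: N*sigma2/2 =", N * sigma2 / 2, ")")
print("cov Re[X_k], Re[X_k']  :", cov_re[k, kp], "(theory: 0)")
print("cov Im[X_k], Im[X_k']  :", cov_im[k, kp], "(theory: 0)")
# Mirrored bins k and N-k are fully correlated for a real input signal;
# this is the discrete counterpart of the delta(omega + omega') term.
print("cov Re[X_k], Re[X_N-k] :", cov_re[k, N - k], "(theory: N*sigma2/2 =", N * sigma2 / 2, ")")
```

For bins other than $0$ and $N/2$, the variances should come out near $N\sigma^2/2$, the cross-covariances of distinct non-mirrored bins near zero, and bins $k$ and $N-k$ fully correlated, mirroring the $\delta(\omega-\omega')$ and $\delta(\omega+\omega')$ structure above.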

  • Thanks a lot for your time. Could you also point me to a reference that has already addressed this problem? I'd rather deal with one that formulates the integrals in terms of measures rather than relying on the Riemann integral everywhere. – Cupitor Jul 10 '14 at 10:41
  • @Cupitor: You're welcome. Unfortunately, the only reference I know about is a paper about the D(T)FT of windowed noise. It also doesn't consider the covariances that you're interested in. You can find the paper here: http://users.ece.gatech.edu/mrichard/DFT%20of%20Noise.pdf – Matt L. Jul 10 '14 at 10:45
  • @MattL., So you calculated these things from scratch for the first time today? :D – Cupitor Jul 10 '14 at 10:46
  • @jojek: Oh well, I felt like shuffling around some integrals ... :) – Matt L. Jul 10 '14 at 10:46
  • @Cupitor: Yes, but the techniques are pretty standard. I learned them when I studied DSP and probability theory. So I didn't really invent anything new here, unfortunately ;) – Matt L. Jul 10 '14 at 10:48
  • Hehe! :D That's true! If you don't mind, I would be happy if you could confirm this as well: from your results it holds that $E\{X^*(\omega)X^*(\omega')\}=2\delta(\omega+\omega')S_x(\omega)$, and as a result one gets $E\{\text{Re}\{X(\omega)\}\,\text{Im}\{X(\omega')\}\}=0$, right? – Cupitor Jul 10 '14 at 10:54
  • @Cupitor: Yes, you just miss a factor of $\pi$ in the first equation. – Matt L. Jul 10 '14 at 11:09
  • @MattL., I am playing around with this stuff and there is a serious problem here. Your answer implies that the spectrum of a stationary signal is infinite at each point. Is this because we don't average over time and the time period is infinitely long? – Cupitor Jul 12 '14 at 13:21
  • @Cupitor: What exactly do you mean by "infinite at each point"? Which function? – Matt L. Jul 12 '14 at 13:55
  • @MattL., I am sorry, not at each point. I mean at any diagonal point ($\omega=\omega'$ or $\omega=-\omega'$) it is infinite. Obviously it is zero at off-diagonal points. But I have a hard time understanding what it means for the covariance to be infinite when the frequencies coincide or their negatives coincide. – Cupitor Jul 12 '14 at 14:33
  • @Cupitor: It means that $\text{Re}\{X(\omega)\}$ and $\text{Re}\{X(\omega')\}$ are orthogonal for $\omega\neq\pm\omega'$ (and the same is true for the imaginary part), and that the power of $\text{Re}\{X(\omega)\}$ and $\text{Im}\{X(\omega)\}$ is infinite. Whether this makes a lot of sense is another question, because normally $X(\omega)$ doesn't even exist. The $\delta$ impulses (and, consequently, the infinite powers) occur because you consider infinitely long signals. If you were to do the same thing with windowed signals, then you would get finite values for the powers. – Matt L. Jul 12 '14 at 15:36
  • @MattL., I guessed exactly the same thing, that it's because of the infinite time series. But what do you mean that normally $X(\omega)$ doesn't exist?! Because at each frequency the covariance is infinite? – Cupitor Jul 12 '14 at 15:41
  • @Cupitor: Because the integral $X(\omega)=\int_{-\infty}^{\infty}x(t)e^{-j\omega t}dt$ (i.e., the Fourier transform) usually doesn't exist for random $x(t)$. That's why they invented the power spectrum. – Matt L. Jul 12 '14 at 15:47
  • @MattL. This makes me more confused. I asked another question about power spectrum. I would be happy if you could take a look at it. http://dsp.stackexchange.com/questions/17322/power-spectrum-defnition – Cupitor Jul 12 '14 at 16:04
  • @Cupitor: I just took a look, and may answer it as soon as I have the time to do so. Checking the wiki entry for power spectrum, I came across a footnote which exactly resembles Eq. (2) of my answer, so some other people obviously did the same weird calculations and, luckily, came up with the same result! http://en.wikipedia.org/wiki/Power_spectrum#cite_note-12 – Matt L. Jul 12 '14 at 16:27
  • @MattL., interesting, I actually missed that note. Thank you for pointing it out. I would be grateful for an answer to my other question. I think $S_{xx}$ should involve a limit with $\frac{1}{T}$ in the stochastic case, but I am not sure about it. Also, in the case where the real signal is discrete, your calculations again show that the variances are infinite, yet I think completely random discrete series do exist. That should be explained in some other way, or maybe there are some problems with the calculations when it's discrete? – Cupitor Jul 12 '14 at 16:33