17

I'm trying to understand how the PSD is calculated. I've looked in a few of my Communication Engineering textbooks but to no avail. I've also looked online. Wikipedia seems to have the best explanation; however, I get lost at the part where they decide to make the CDF (Cumulative Distribution Function) and then, for some reason, decide to relate that to the autocorrelation function.

I guess what I don't understand is: how does autocorrelation have anything to do with calculating the PSD? I would've thought that the PSD would simply be the Fourier transform of $P(t)$ (where $P(t)$ is the power of the signal with respect to time).

user968243
  • How do you define $P(t)$? – Phonon Mar 09 '13 at 06:31
  • I don't really define it as anything. It's just some power signal. I guess if I had to define it, it'd be $P(t) = v(t) \cdot i(t)$... I guess the point is that the PSD is not $\mathcal{F}\{P(t)\}$ and it has something to do with autocorrelation and I don't get what... – user968243 Mar 09 '13 at 06:36
  • You can't really define power like that for arbitrary signals. There are no voltage and current concepts. Power in this case is defined as in power of a wave (electromagnetic if you like). So it's $\frac{1}{T} \int_0^Tx^2(t)dt$, and it's a single number, not a time-varying quantity. – Phonon Mar 09 '13 at 06:53
  • Okay. Would you be able to explain why autocorrelation can be used to calculate the PSD? I don't really get how they go from $ \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x^2(t)\,dt $ to $ \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t) \cdot x(t + \tau)\,dt $. I don't really understand the point of autocorrelating and how that seemingly magically helps calculate the PSD. – user968243 Mar 09 '13 at 07:46
  • 1
  • Read about the Wiener-Khinchin theorem. You are refusing to understand what Phonon is pointing out to you: the limit that you are calculating is a constant, and so its Fourier transform is just an impulse at $f=0$ in the frequency domain. If that floats your boat, go for it, but it is not the power spectral density as everyone else understands it. – Dilip Sarwate Mar 09 '13 at 13:36
  • 2
  • I have read about that theorem... And I get how it relates the Fourier transform to autocorrelation. And I'm not refusing to understand what Phonon said... I understand exactly what @Phonon said. What I don't understand is why the autocorrelation formula is used, and I also don't understand why the Fourier transform way is used (to get the PSD you can take the Fourier transform, take the magnitude of it, square it, etc.)... I have no idea why doing that would give a PSD, and I haven't been able to find a decent derivation. – user968243 Mar 09 '13 at 14:13
  • There are signals for which the Fourier transform is undefined, so you can't just find the Fourier transform and proceed to compute the PSD. For those types of signals, you must approach the PSD using the autocorrelation function. – user2718 Mar 10 '13 at 02:23
  • I guess I'd just like to see a derivation of how the formula for calculating the PSD comes about... – user968243 Mar 10 '13 at 04:20
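A minimal NumPy sketch of the distinction Phonon is drawing in the comments above: the average power is a single number (the $\tau = 0$ value of the time-average autocorrelation), whereas the autocorrelation as a function of the lag $\tau$ still varies and therefore still carries spectral information. The test signal (a cosine plus noise) and the lags shown are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4096
n = np.arange(N)
# Arbitrary test signal: a cosine at 0.1 cycles/sample plus white noise
x = np.cos(0.2 * np.pi * n) + 0.5 * rng.standard_normal(N)

# Average power: a single number, the discrete analogue of
# lim 1/(2T) * integral of x^2(t) dt
avg_power = np.mean(x ** 2)

def time_avg_autocorr(x, lag):
    """Time-average autocorrelation estimate at an integer lag."""
    return np.mean(x[: len(x) - lag] * x[lag:])

print(avg_power)                          # about 0.75 for this signal
print(time_avg_autocorr(x, 0))            # identical to the average power
print([round(time_avg_autocorr(x, k), 2) for k in (1, 5, 10)])  # varies with lag
```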

2 Answers

21

You are right, the PSD has to do with taking the Fourier transform of the power of the signal, and guess what... it does. But first let's look at the mathematical relationship between the PSD and the autocorrelation function.

  1. Notations:

    • Fourier Transform: $$ \mathcal{F}[ x(t)] = X(\omega) = \int_{-\infty}^{\infty} x(t)e^{-j\omega t}dt $$
    • (Time) Auto-Correlation Function: $$ R(\tau) = x(\tau) * x(-\tau) = \int_{-\infty}^{\infty} x(t)x(t + \tau)dt $$
  2. Let's prove that the Fourier transform of the autocorrelation function is indeed equal to the power spectral density of our stochastic signal $x(t)$.

$$ \mathcal{F}[ R(\tau)]= \int_{-\infty}^{\infty} R(\tau)e^{-j\omega \tau}d\tau $$ $$ = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x(t)x(t + \tau) e^{-j\omega \tau} dtd\tau $$ $$ = \int_{-\infty}^{\infty}x(t) \underbrace{\int_{-\infty}^{\infty} x(t + \tau) e^{-j\omega \tau} d\tau}_{\mathcal{F}[x(t + \tau) ] = X(\omega)e^{j\omega t}} dt $$ $$ = X(\omega) {\int_{-\infty}^{\infty}x(t)e^{j\omega t} dt} $$

$$ = X(\omega)X^*(\omega)= |X(\omega)|^2 $$

(The last integral is $X(-\omega)$, which equals $X^*(\omega)$ because $x(t)$ is real-valued.)
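This identity is easy to check numerically for a discrete, finite-length signal, where the Fourier transform becomes a DFT and the correlation integral becomes a finite sum. A minimal NumPy sketch (the signal and its length are arbitrary; the zero-padding to $2N-1$ points is just what makes the linear and circular correlations agree):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
x = rng.standard_normal(N)                # one realisation of a random signal

# Deterministic (linear) autocorrelation r[k] = sum_n x[n] x[n+k]
# for lags k = -(N-1), ..., N-1
r = np.correlate(x, x, mode="full")

# DFT of the autocorrelation on a (2N-1)-point grid.  np.fft assumes the
# sequence starts at lag 0, so roll the zero lag to index 0 first.
R = np.fft.fft(np.roll(r, -(N - 1)))

# |X(w)|^2 from the zero-padded signal on the same frequency grid
X = np.fft.fft(x, n=2 * N - 1)

print(np.allclose(R, np.abs(X) ** 2))     # True, up to floating-point error
```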


What does it all mean? Note: this explanation is a bit "hacky", but here it goes.

The Fourier transform tells us the spectral components of a signal. In our case, the signal is stochastic, so trying to calculate the spectral components of the signal is pointless because, for every realisation of the random process, you will get a different expression for $ \mathcal{F}[x(t)]$.

What if you take the expected value of the Fourier transform, then? This wouldn't work. Take a zero-mean signal, for example:

$$ \mathbb{E}\{ \mathcal{F}[x(t)] \} = \mathcal{F}[\mathbb{E}\{ x(t) \}] = 0$$

Instead, what if you take the Fourier transform of the square of the signal? $$ \mathbb{E}\{ \mathcal{F}[x^2(t)] \} = \mathcal{F}[\underbrace{\mathbb{E}\{ x^2(t) \}}_{\text{Av. Power of the Signal}}] $$

The autocorrelation function is essentially the $P(t)$ which you were alluding to.
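To see why the expectation (i.e. averaging over realisations) matters, here is a rough NumPy sketch with an arbitrarily chosen two-tap FIR process: the periodogram $|X(\omega)|^2/N$ of a single realisation is very noisy, but its average over many realisations settles onto the theoretical PSD $|H(\omega)|^2\sigma_w^2$ of the process.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_realisations = 1024, 2000
b = np.array([1.0, 0.9])                   # process: x[n] = w[n] + 0.9 w[n-1]

periodograms = []
for _ in range(n_realisations):
    w = rng.standard_normal(N)             # unit-variance white noise
    x = np.convolve(w, b)[:N]              # one realisation of the process
    periodograms.append(np.abs(np.fft.rfft(x)) ** 2 / N)

single = periodograms[0]                   # one realisation: very noisy
avg_psd = np.mean(periodograms, axis=0)    # ensemble average: smooth

# Theoretical PSD |H(w)|^2 (with sigma_w^2 = 1) on the same frequency grid
f = np.fft.rfftfreq(N)
H = b[0] + b[1] * np.exp(-2j * np.pi * f)
theory = np.abs(H) ** 2

print(np.mean(np.abs(single - theory) / theory))   # order one: a single periodogram is useless
print(np.mean(np.abs(avg_psd - theory) / theory))  # a couple of percent: close to theory
```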

References:

[1] Communications 1, P-L. Dragotti, Imperial College London

[2] White Noise and Estimation, F. Tobar [Unpublished Report]

ssk08
  • Fantastic explanation! A small calculus question: are you able to interchange the $dt$ and the $d\tau$ inside the double integrals only because their limits are both from $-\infty$ to $+\infty$? – Spacey Mar 12 '13 at 01:15
  • yes that's right. – ssk08 Mar 12 '13 at 01:30
  • Okay, I think that I kind of get it. I can see how the Fourier transform is related to autocorrelation. I don't really understand, though, what the issue is with taking the Fourier transform of $x(t)$ or $x^2(t)$. I don't really see why the expected value needs to be taken (I know it averages it, but I don't know why that is necessary), and I don't really understand what you mean by 'for every realisation of the random process, you will have different expressions for'.

    If you could elaborate a little, that'd be great!

    Thank you for your time!

    – user968243 Mar 12 '13 at 13:17
  • 1
    @user968243 As far as the "for every realisation" part, think of it this way: your original signal, let's say of length $N$, that you want to find the PSD for, is a random vector. So it is a vector with $N$ components. Now, since this is a random vector, every time you 'roll the dice', you get different values for its components. One possibility might be [3 4 1 9 ...]. Another possibility might be [2.9 4.2 1.1 9.02 ...]. This is what he means when he says, "for every realisation of the random process (your vector), you get different expressions for" (the Fourier transform). Make sense? – Spacey Mar 12 '13 at 14:57
  • @Mohammad summed it up perfectly. – ssk08 Mar 12 '13 at 15:57
  • Okay, so since the signal is random, we have to find the expected value; that makes sense. What is it, though, that makes the signal stochastic (non-deterministic)? Is it simply because PSDs are often used to view random signals such as noise? Thank you for your help! – user968243 Mar 13 '13 at 09:40
  • @Spacey For the small calculus question: no, it's because $x(t)$ does not depend on the variable of integration, but $x(t+\tau)$ does, so it must remain inside the $d\tau$ integral, and both of them must be inside the $dt$ integral because they both depend on $t$. The rearrangement performed preserves that, so it is allowed. The reason why it is $X^*(\omega)$ is that $x(t)$ is real here, and the positive oscillatory factor results in a reversal of the frequency axis ($\omega \to -\omega$), which is the same as complex conjugation in the frequency domain when the time domain is real. – Lewis Kelsey Oct 29 '20 at 17:37
8

Nice derivation, but I think you can do this even more easily.

Autocorrelation: $r(t) = x(t)*x(-t)$, i.e. the convolution of the signal with its time-flipped self.

Convolution in the time domain is multiplication in the frequency domain.

Time flip in the time domain is "complex conjugate" in the frequency domain (for a real-valued signal).

Hence we get $$ R(\omega) = \mathcal{F}\{r(t)\} = \mathcal{F}\{x(t)\}\mathcal{F}\{x(-t)\} = X(\omega) X^*(\omega) = |X(\omega)|^2 = PSD $$
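The same chain of steps can be checked numerically with the DFT, where the time flip is circular, $x[(-n) \bmod N]$, and, for a real signal, corresponds to conjugation of the spectrum. A quick NumPy sketch (the signal is an arbitrary random vector):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 128
x = rng.standard_normal(N)                 # real-valued test signal
X = np.fft.fft(x)

# Time flip (circular): x[(-n) mod N]  <->  conjugate spectrum (x real)
x_flip = np.roll(x[::-1], 1)
print(np.allclose(np.fft.fft(x_flip), np.conj(X)))   # True

# Convolving x with its flipped self  <->  X(w) X*(w) = |X(w)|^2
r = np.fft.ifft(X * np.conj(X)).real       # circular autocorrelation of x
print(np.allclose(np.fft.fft(r), np.abs(X) ** 2))    # True: the PSD
```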

Hilmar