
What "are" they? What's a sensible way to interpret the coefficients (and what isn't)? To pose specifics:

  1. DFT coefficients describe the frequencies present in a signal
  2. They describe the sinusoidal frequencies of the source

Are either of the above true?

OverLordGoldDragon
  • I provided a linear algebraic interpretation in this answer to a related question. – Joe Mack Sep 17 '20 at 23:05
  • Perhaps you will find this a more tangible physical interpretation for real-valued signals: dsprelated.com/showarticle/768.php – Cedron Dawg Sep 18 '20 at 01:55
  • Alright, I botched this - see comments under Dan's answer. I'd delete the Q&A but others have answered -- I might put together an "actual" answer sometime later, unless someone gets the idea of what I seek in such an answer and answers. – OverLordGoldDragon Sep 18 '20 at 13:32
  • @CedronDawg I've never seen or thought of the complex exponential in the DFT in terms of roots of unity; it's an interesting perspective. Does that (or another) article make explicit use of it to explain the forward/inverse transforms? If so I'll give them a full read. – OverLordGoldDragon Sep 18 '20 at 13:37
  • Whether you want to skim, read, or study them doesn't matter much to anyone but yourself. They are there, available, and will remain so. The interpretation given in the article doesn't really apply to the inverse very well even though they are mathematically equivalent (sans the normalization). Every bin is at a root of unity by definition; it is inherent. The article makes it explicit. Nearly all my articles are novel. FYI, none of the downvotes are mine, and I would strongly advise you to stop trying to use your voting power as a control tool. – Cedron Dawg Sep 18 '20 at 13:46
  • @CedronDawg What "control tool"? I didn't ask for any changes. I don't see what's with your accusations lately. – OverLordGoldDragon Sep 18 '20 at 14:27
  • I didn't see the latter comments below until after my comments. Nothing I have said is meant to be an "accusation". However, since you brought the word up, I also think styling your online persona on "The Accuser" is a bad idea. Friendly advice. – Cedron Dawg Sep 18 '20 at 14:37
  • @CedronDawg You've been called pedantic and a demagogue by others and didn't revolt nearly as much - I suppose it's some beef with the supposed 'impostor' that is I. Best we focus on Q&A contents. – OverLordGoldDragon Sep 18 '20 at 14:40
  • Maybe, but we got the chat prompt again. You can still email me if you feel the need. Some things are better discussed in private. – Cedron Dawg Sep 18 '20 at 14:55
  • @CedronDawg Roots of unity being at different frequencies... do you have any special interpretation of this? Btw, this OverLord guy is probably the most interesting OP ever... :-)) He doesn't ask for learning, he asks for testing... ;-) as he mostly posts his own answer :-) – Fat32 Sep 21 '20 at 20:14
  • @Fat32 Why, thank you - but I assure you all but this question (initially) are for learning, and I always welcome complementing answers. I've likewise wondered about some useful visualization of these roots of unity, but haven't dug there much yet. – OverLordGoldDragon Sep 21 '20 at 20:22
  • Hmm, good luck in your quest... ;-) – Fat32 Sep 21 '20 at 20:42
  • @Fat32 It actually is a quest, one I hope to finish within a week (it will be clear when) - but I think the community here is interesting enough that I'll stick around afterwards. – OverLordGoldDragon Sep 21 '20 at 20:54
  • @Fat32 I find him a technocratic idealist like his namesake. Ziggy Marley does a great 75th tribute to his dad; you should both listen to the lyrics carefully, or not. Roots, rock, reggae. – Cedron Dawg Sep 21 '20 at 22:00

3 Answers


At its most fundamental, the DFT is about fitting a set of basis functions to a given set of sampled data. The basis functions are all sinusoidal functions, expressed as the complex exponential with a purely imaginary exponent. Using the most common scaling convention, each basis function, without its scaling coefficient, is:

$$ g_k[n] \triangleq \tfrac1N e^{j \omega_k n} $$

For the DFT:

$$ \omega_k = \frac{2 \pi k}{N} $$

and the basis functions add up as

$$\begin{align} x[n] &= \sum\limits_{k=0}^{N-1} X[k] \ g_k[n]\\ \\ &= \sum\limits_{k=0}^{N-1} X[k] \ \tfrac1N e^{j \omega_k n} \\ \\ &= \tfrac1N \sum\limits_{k=0}^{N-1} X[k] \ e^{j \frac{2 \pi k}{N} n} \\ \end{align}$$

It's real easy to solve for the coefficients:

$$\begin{align} X[k] &= \sum\limits_{n=0}^{N-1} x[n] \ e^{ -j \frac{2 \pi k}{N} n} \\ \end{align}$$
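A quick numerical check of the two formulas above (a sketch in NumPy; `np.fft.fft` happens to follow the same scaling convention, with the $\tfrac1N$ on the inverse):

```python
import numpy as np

N = 8
rng = np.random.default_rng(0)
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)  # arbitrary complex data

n = np.arange(N)
# Basis functions g_k[n] = (1/N) e^{j 2 pi k n / N}, one row per k
g = np.exp(2j * np.pi * np.outer(n, n) / N) / N

# Forward transform: X[k] = sum_n x[n] e^{-j 2 pi k n / N}
X = np.array([np.sum(x * np.exp(-2j * np.pi * k * n / N)) for k in range(N)])

# The weighted basis functions sum back to the data: x[n] = sum_k X[k] g_k[n]
x_rec = X @ g

assert np.allclose(x_rec, x)          # basis expansion recovers the signal
assert np.allclose(X, np.fft.fft(x))  # same convention as NumPy's FFT
```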

robert bristow-johnson
  • This neither explains anything nor addresses (1) or (2). – OverLordGoldDragon Sep 18 '20 at 12:38
  • $X[k]$ are the coefficients to each basis function in the linear summation. That's all that they are. Nothing else. The periodicity is obvious because each basis function is periodic with period $N$, so any sum of the basis functions is also periodic with the same period. – robert bristow-johnson Sep 18 '20 at 17:42
  • Perhaps one thing you might be missing is assigning the effect of windowing to the DFT. It's not the DFT that is windowing your data. It's the windowing that you do to the data before it is passed to the DFT. What the DFT does, by fitting those $N$ data values to the DFT basis functions (all of which are periodic with period $N$), is periodically extend the $N$ samples passed to the DFT. It always does that and it is inherent to the DFT. – robert bristow-johnson Sep 18 '20 at 17:47
  • Perhaps the purpose of the Q&A isn't entirely clear; I sought to clear up some misconceptions about the coefficients. I thought I already had an answer - and I pretty much do for the real-valued input case, but since that's only a special case, it doesn't count, as my goal's to explain things at the fundamental level. I now seek the same level of intuition for the complex case, and I'll get there with time invested and may update my answer later. – OverLordGoldDragon Sep 18 '20 at 21:34
  • Well, I can't always discern nor account for what is or is not "entirely clear" to someone else. I have once asked and answered a question myself (because I was spoiling for a fight). It doesn't always get perceived the way it was intended. But I spelled out, strictly speaking, what the DFT coefficients are about. They are the weighting coefficients for a set of basis functions that are "sinusoidal". – robert bristow-johnson Sep 19 '20 at 02:10
  • Huh, didn't know you can have different avatars across networks. And good Q&A, put on my list. – OverLordGoldDragon Sep 19 '20 at 21:10
  • @robertbristow-johnson I don't think of complex exponentials as "sinusoidal", to the same extent and for the same reason that I would say the sum of two sinusoids of different frequencies is not sinusoidal - they have sinusoidal components, but that doesn't mean the result itself is a sinusoid. Sinusoidal to me is the shape of a sinusoid, not that the signal is made up of sinusoids (which all real single-valued analytic signals are, but we don't call them sinusoidal either). – Dan Boschen Sep 21 '20 at 15:13
  • Well, @DanBoschen, you don't have to think of complex exponentials as "sinusoidal". But what I said was "complex exponential with a purely imaginary exponent". Note the use of singular reference, not plural. I think Leonhard Euler and I agree that the "complex exponential with a purely imaginary exponent" is "sinusoidal". Most certainly this $$ g_k[n] \triangleq \tfrac1N e^{j \omega_k n} $$ is sinusoidal. – robert bristow-johnson Sep 21 '20 at 17:44

To be clear, the DFT coefficients do NOT give the amplitudes and phases of real sinusoidal components of the original signal unless the signal itself is real; rather, they give the amplitudes and phases of the exponential frequency components, scaled by $N$. These take the form $c_k e^{j\omega_k n}$ and are referred to as "complex sinusoids".

The sum of complex weighted exponentials is directly from the inverse DFT formula as given below:

$$x[n] = \frac{1}{N}\sum_{k=0}^{N-1}X[k]e^{j \omega_k n}$$

Where each $X[k]$ is given as:

$$X[k] = \sum_{n=0}^{N-1}x[n]e^{-j \omega_k n}$$

Showing how each sample is restored as the sum of all the properly weighted and phased frequency components each in the form of $e^{j \omega_k n}$.

The generalized expression $Ae^{j\phi}$ is a phasor with magnitude $A$ and angle $\phi$. Thus every coefficient in the DFT is a complex number that represents the magnitude and starting phase of a complex phasor in time that rotates at an integer multiple of the fundamental frequency, which is given by the inverse of the total time duration of the time-domain waveform (similar to the continuous-time Fourier Series expansion).

Stated another way, the inverse DFT reconstructs any arbitrary time-domain sequence of samples, both real and complex, from a set of basis functions of the form $e^{j\omega_k n}$, and the DFT maps any arbitrary time-domain sequence of samples, both real and complex, into the components of those basis functions (showing how much of each basis function is contained in the time-domain signal and its phase relationship to all the other components).

To give this visual meaning (which, as the OP further clarified in the comments, is what's desired), consider this: we can select any arbitrary $N$ samples throughout the complex plane, representing a discrete complex time-domain waveform, such that we sequence through each of those samples in turn. The DFT, amazingly, will return to us the magnitude and starting phase for $N$ vectors, with each vector rotating an integer number of cycles, from 0 (no rotation) up to $N-1$ cycles around the unit circle, such that if we add all these spinning phasors (and divide by $N$), the end point of this spinning geometrical contraption will pass through every time-domain sample exactly at the correct moment in time.

To immediately visualize this, consider the simplest case of a 2-point IDFT, which results in

$$x[n] = \frac{1}{N}\sum_{k=0}^{N-1}X[k]e^{j \omega_k n} = \frac{X_0}{2}e^{j0}+\frac{X_1}{2}e^{j\omega_1 n}$$

where $\omega_1$ corresponds to one full rotation over the frame, as depicted by the following graphic.

2 pt IDFT

Consider if we chose as time domain samples $[1,0]$: The DFT result is $[1,1]$ representing a phasor of magnitude $1$ and angle $0$ that doesn't rotate, added to a phasor of magnitude $1$ that rotates one cycle (just as depicted in the graphic above), so at both samples in time the above graphic will be at $[2,0]$, and after dividing by $N$ is our original sequence.

Consider if we chose as time domain samples $[1,1]$: The DFT result is $[2,0]$ representing a phasor of magnitude $2$ and angle $0$ that doesn't rotate, added to a phasor of magnitude $0$, so at both samples in time the result will be at $[2,2]$, and after dividing by $N$ is our original sequence.

Finally, consider if we chose as time domain samples $[1+j1,-1+j1]$: The DFT result is $[2j, 2]$ representing a phasor of magnitude $2$ and angle $\pi/2$ that doesn't rotate, added to a phasor of magnitude $2$ and angle $0$ that rotates one cycle (as depicted in the graphic below), so at both samples in time the result will be at $[2+j2, -2+j2]$, and after dividing by $N$ is our original sequence.

IDFT for j 1
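The three worked examples above can be verified in a few lines (a sketch in NumPy; for $N=2$, $\omega_1 = \pi$ per sample, i.e. one rotation over the frame):

```python
import numpy as np

n = np.arange(2)
for x in ([1, 0], [1, 1], [1 + 1j, -1 + 1j]):
    x = np.asarray(x, dtype=complex)
    X = np.fft.fft(x)                 # the two phasors' magnitudes and phases
    # Sum of the spinning phasors (before the 1/N division):
    s = X[0] * np.ones(2) + X[1] * np.exp(1j * np.pi * n)
    assert np.allclose(s / 2, x)      # dividing by N recovers the samples
```

For the three inputs, `X` comes out as $[1,1]$, $[2,0]$, and $[2j,2]$ respectively, matching the values worked out above.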

Dan Boschen
  • A complex exponential's real and imaginary parts are sinusoids, which is what I meant, and I can clarify explicitly if needed. – OverLordGoldDragon Sep 18 '20 at 12:39
  • Yeah, an important point - and signals can be complex (so not sinusoids at all!) – Dan Boschen Sep 18 '20 at 12:40
  • Sure, but my two bolded definitions in the answer still apply - though I admit the intuitions mainly address real-valued signals, but it wasn't meant to be an exhaustive coverage anyway. – OverLordGoldDragon Sep 18 '20 at 12:44
  • I don't think they do; in fact I would argue they don't - that is not what the DFT equation states, so it's quite misleading. But I come from a technical world where the use of complex signals is very prevalent, so it's more in the forefront of my thinking. Once you cross the bridge to complex signals, the whole of signal processing becomes much more intuitive. Thinking in sinusoids is actually not very helpful for a lot of the constructs, including the DFT. – Dan Boschen Sep 18 '20 at 13:04
  • Alright, you made me think - and you're right; I completely overlooked complex-valued inputs. My amplitude-phase perspective only directly applies on real-valued inputs thanks to imaginary coeff symmetry, but my goal was to explain the fundamental, not a special case, and I agree my description is quite incomplete, and partly misleading. The coefficients still do encode amplitude and phase information of real and imaginary components, so that isn't "wrong", but indeed that's not the best description (also computed fairly differently). – OverLordGoldDragon Sep 18 '20 at 13:23
  • Still, my answer is nicely applicable to real signals, and clear much confusion about what the "spectrum" is supposed to represent - don't think it was fair to sink the answer all considered, but I admit I ultimately failed to deliver what I sought to. – OverLordGoldDragon Sep 18 '20 at 13:25
  • If you edit your answer I'll retract my downvote. – OverLordGoldDragon Sep 18 '20 at 13:26
  • What did you want me to edit? Also, the text you reference completely excludes complex signal processing as well, giving similarly misleading comments such as the idea of "positive frequency sinusoids", so have caution when reading with that context in mind. There is otherwise a lot of good content, but I think it misses the opportunity to really understand what is going on -- the author mentioned to me he doesn't come from the complex signal processing world but will hopefully be updating this in a future edition. – Dan Boschen Sep 18 '20 at 13:38
  • Completely agreed about the text, and I'm cautious with it, but that one key idea slipped my mind entirely as I became overly focused on real-valued inputs. And, nothing in particular to edit; just that Stack Exchange won't allow me to change the vote unless an edit is made - so you can insert a space or something. – OverLordGoldDragon Sep 18 '20 at 13:40
  • Dan, i surely dispute this: " then to be clear DFT coefficients do NOT give the amplitude and phases of sinusoidal components of the original signal unless the signal is real..." DFT doesn't care about if the original signal is real. take your original signal $x[n]$ and multiply it by $j$ and now it's purely imaginary. yet the DFT will return amplitudes and phases of the sinusoidal components that make up the signal. – robert bristow-johnson Sep 21 '20 at 18:14
  • and Dan, unlike Ced or BobK, i have a helluva lotta respect for you. yer not a self-important blowhard. – robert bristow-johnson Sep 21 '20 at 18:16
  • @robertbristow-johnson and I am kind of bummed that Ced and BobK don't respect me :( – Dan Boschen Sep 21 '20 at 18:18
  • they don't respect anybody but themselves (and now each other, because they find it convenient). BobK has fucked up many a wikipedia article and Ced was saying some of the same crap in the USENET comp.dsp where, at least @PeterK and Olli and me came from. They're hacks. They self-taught themselves some stuff that my own engineering profs warned me against (both about units and about the DFT) and they are trying to propagate their neophyte understanding. – robert bristow-johnson Sep 21 '20 at 18:24
  • @Robert I sense some hard feelings there----exhale, let me know your thoughts on the sinusoid so I can get my own head straight. I think I probably am just not using the right definition of sinusoid and am open to that. – Dan Boschen Sep 21 '20 at 18:27
  • the sequence [1, -.2, -.1, -.5, 5, 3, 5, 7, -1] is not sinusoidal, but the Fourier Transform (of any variety) does not offer to make it sinusoidal, but offers to find a series (or "summation") of sinusoids to fit that sequence. if it's the DFT (as opposed to the continuous FT) it's the summation of a finite set of sinusoids, all of which can be said to have a period of $N$, in this case $N=9$. so then the sum is also periodic with the same period. – robert bristow-johnson Sep 21 '20 at 18:28
  • @robertbristow-johnson right- that I get. I was using the analogy that I wouldn't call that sinusoidal even though it clearly has sinusoidal components. So was thinking the argument for $e^{j \omega t}$ to be sinusoidal isn't that it consists of two sinusoidal components. So what then is the right definition of "Sinusoidal" if not appearing like a sine wave (which I argue $e^{j \omega t}$ does not, it is its real and imaginary components that do). Do you see my quandary?? – Dan Boschen Sep 21 '20 at 18:31
  • "... $e^{j \omega t}$ as a sum of two sinusoids (real and imaginary) but likewise the result isn't sinusoidal even though the components are." But Dan, the two sinusoids have exactly the same frequency. add two sinusoids of exactly the same frequency and you get a single sinusoid of exactly the same frequency. the amplitude and phase might be different, but it continues to be a sinusoid. – robert bristow-johnson Sep 21 '20 at 18:32
  • @robertbristow-johnson if they are both real or both imaginary that is true. Here we have one is real and the other is imaginary and the result is the magnitude is constant. (See-- perhaps the magnitude being shaped like a sinusoid is not what a sinusoid make ---if Yoda knew DSP) – Dan Boschen Sep 21 '20 at 18:34
  • $$ A \cos(\omega t) + B \sin(\omega t) = \sqrt{A^2 + B^2} \cos(\omega t + \phi)$$ where $\phi = \arg { A -jB }$ unless i fucked up a sign here. What happens if $A=1$ and $B=j$? – robert bristow-johnson Sep 21 '20 at 18:34
  • unless i fucked up the minus sign (i gotta go look it up), the mathematical identity continues to be valid. – robert bristow-johnson Sep 21 '20 at 18:38
  • @DanBoschen it's not just the $\arctan\left(\frac{-B}{A} \right)$. that is good only for $A>0$. there is that quadrant problem and you need that atan2() function, which is what i was saying. but what i am still worried about (i gotta blast out the math) is the minus sign. – robert bristow-johnson Sep 21 '20 at 19:23
  • and it doesn't assume $A$ and $B$ are real. – robert bristow-johnson Sep 21 '20 at 19:25
  • @robertbristow-johnson oh yes I see arg(A+jB) then according to my graphic. – Dan Boschen Sep 21 '20 at 19:26
  • did i fuck up the minus sign? – robert bristow-johnson Sep 21 '20 at 19:26
  • i just went over it. and i did not fuck up the minus sign. i don't think so. so that trig identity can be made to be true even if $A$ and $B$ are not purely real. – robert bristow-johnson Sep 21 '20 at 19:31
  • @robertbristow-johnson So for your example A=1 B=j it would come out to $\sqrt{1+j}cos(\omega t + \phi)$, which would have a magnitude that goes up and down sinusoidally rotated by the complex phase and scaled by the complex magnitude of $\sqrt{1+j}$--- that doesn't match the result of my graphic – Dan Boschen Sep 21 '20 at 19:34
  • you see $$ \cos(z) = \frac{e^{j z}+e^{-j z}}{2} $$ and $$ \sin(z) = \frac{e^{j z}-e^{-j z}}{2j} $$ even if $z$ is complex – robert bristow-johnson Sep 21 '20 at 19:36
  • Ah I see--- due to $\phi$ ending up being complex? I don't know what frac means? – Dan Boschen Sep 21 '20 at 19:39
  • @DanBoschen, sorry, but i *never* do the chat thing here. i really don't like nor agree with shunting these discussions off to the chat. – robert bristow-johnson Sep 21 '20 at 19:44
  • the cosine of a purely imaginary argument does not appear to be sinusoidal even though we call it a "sinusoidal function". but the exponential of a purely imaginary argument does appear to be sinusoidal even though we don't call call it a "sinusoidal function". – robert bristow-johnson Sep 21 '20 at 19:52
  • @robertbristow-johnson well we do call it a "complex sinusoid" and so your expression that shows it as a single equation $Acos(\theta)$ with complex $A$ and complex $\theta$ almost clears that all up (if it weren't for my cosh extreme-- but maybe that is just like the extreme ($\cos(0)$ being no longer "sinusoidal?? Is the latter part of my answer above worthy of asking as a new question so that you can provide the clean answer there? (and we can delete this long thread to make PeterK happy?) – Dan Boschen Sep 21 '20 at 19:59
  • let @PeterK express his unhappiness before deleting. – robert bristow-johnson Sep 21 '20 at 20:01
  • Oh my, the edits and chat - if they don't happen to nail what I'm looking for I'll at least be sure to clarify sufficiently. – OverLordGoldDragon Sep 21 '20 at 20:23
  • @OverLordGoldDragon yes I am now confused about what the true definition of "sinusoidal" is, and specifically if $e^{j \omega t}$ which is composed of real and imaginary sinusoids is in fact entirely on its own a singular "sinusoid". I don't think this was your question specifically so tempted to move the last part of my answer to a question on its own and let the intelligentsia provide clear responses there to just that question alone..... – Dan Boschen Sep 21 '20 at 20:27
  • @DanBoschen Caught up, mostly; yes, a dedicated Q&A is much appropriate. As to what I seek with my question; a visual intuition just like in my answer. I can easily interpret the coefficients as phases, amplitudes, and frequencies of sinusoids that, when added, yield the original signal, as well as the cross-correlation in the forward transform to get the coefficients. This easily explains "spectral leakage" in a DFT with $f=1.1$ without invoking DTFT and other stuff. So it's this idea generalized to a complex signal. – OverLordGoldDragon Sep 21 '20 at 22:28
  • There might not be much more to this than what I already wrote, or there might, don't know. – OverLordGoldDragon Sep 21 '20 at 22:28
  • I'll move that to a new question when I have some more time – Dan Boschen Sep 21 '20 at 22:47
  • Fat16++, Cycles Per Frame * Samples Per Second / Samples Per Frame = Cycles Per Second

    $$ \frac{k \cdot F_s}{N} = f $$

    $$ \frac{\text{None} \cdot \text{Time}^{-1}}{\text{None}} = \text{Time}^{-1} $$

    But I'm just a hobbyist, no cred. I dance so badly, it clears the floor and frightens small children.

    – Cedron Dawg Sep 22 '20 at 09:11
  • So, see comments. Unsure what exactly I was doubting, but I was right, and it's trivial to see. What is $(A + jB) e^{j\theta}$? --> $\sqrt{A^2 + B^2} \cos{(\theta + \arg{(A + jB)})}$. Amplitude and phase of a complex sinusoid. So instead of just summing real sinusoids, we're also summing imaginary, and my explanation remains useful. There might be a prettier, richer explanation, but everything I describe in my answer still applies. This site may be even worse than SE.Physics in terms of downvotes - what nonsense. – OverLordGoldDragon Sep 25 '20 at 05:07
  • I did end up making one substantial edit, which could've been pointed out with a brief comment; the original answer already did all the heavy lifting. – OverLordGoldDragon Sep 25 '20 at 06:11
  • Found literature using "sinusoid" to denote $e^{j \omega t}$, linked in my answer. – OverLordGoldDragon Nov 26 '20 at 13:09

Note: Originally I failed to mention the "complex" clarification to follow (and at bottom), hence the downvotes (now undue). The coefficients give the amplitude and phase of a complex sinusoid. So instead of just summing real sinusoids, we're also summing imaginary ones (to a nonzero result).

This terminology is also used ubiquitously in Wavelet Tour, with one line reading near-identically to mine (p. 26):

[Excerpt from Wavelet Tour, p. 26]


(1) - no. (2) - no. But not entirely no.

The truest meaning stems from the definitions of the forward and inverse transforms, but these won't directly answer all pertinent questions; the rest is a matter of what said definitions imply. The formulation I find most intuitive stems from the inverse:

DFT coefficients describe the sinusoids that, when added, yield the original signal. -- In detail:

DFT coefficients, $X_k$, give amplitudes and phases of sinusoids at integer frequencies $k$, from $0$ to $N-1$, that sum to the original signal $x[n]$, comprised of $N$ points.

That's all. This is the one description that must hold no matter the context. (1) and (2) "turn out true" GIVEN certain criteria are met, but the coefficients themselves don't inherently warrant any of them.


Case 1: $f=1, 4,$ and $8$ adjacent, each lasting for 100 samples.

There are only three frequencies, so if the coefficients indeed describe "what frequencies are in a signal", what's with all these other frequencies?

"There are three peaks, one for each frequency; other values are due to sharp jumps between frequencies, insufficient samples for proper representation, and other imperfections."

Nonsense. There's nothing 'imperfect' about the coefficients; they simply don't directly answer for "which frequencies are in the signal". Instead, they give the amplitudes, phases, and frequencies of sinusoids that, when added, equal the original signal.

This is sometimes formulated as "Fourier Transform tells us which frequencies are in the signal, not when they occur". True, but it misses the point.
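Case 1 can be reproduced along these lines (a sketch; the segment length of 100 samples and the unit amplitudes are assumptions, since the figure isn't reproduced here):

```python
import numpy as np

seg = 100                              # samples per frequency segment (assumed)
t = np.arange(seg) / seg
# f = 1, 4, 8 cycles per segment, placed back to back:
x = np.concatenate([np.sin(2 * np.pi * f * t) for f in (1, 4, 8)])

mag = np.abs(np.fft.fft(x))
# Far more than three coefficients are significant: they describe sinusoids
# that sum to this exact 300-sample signal, not "the three sources".
n_big = np.count_nonzero(mag > 0.01 * mag.max())
assert n_big > 6                       # not just 3 peaks (+ their conjugates)
```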


Case 2: $f=1$, 100 samples, with a 1 sample "dip".

What's with all the higher frequencies, even if they are relatively small? Surely this is noise, and not part of the 'actual' signal, and we should just pretend they're zero - right?

Wrong. These can very well be 100 different sources overlapping exactly as described by the coefficients to produce this exact dip. It could also be noise. We don't know. Of course we can make rational assumptions about what's plausible, but this requires knowledge about the source system: how many sinusoidal sources are there, if any? What noise can we expect? The example above is trivial; if we know nothing else, it's almost certainly noise - but a real-world signal is far more intricate.

This addresses both (2) and (1). Not only do we not know - we can't know anything exact about the source system just by observing it (its "output"). It could be a magnetic pole spun at 1 Hz, with an electrical shock inserted into the circuit at an instant. Or it could be 100 antennas synched just right. The end result is identical, and it's the only thing the DFT operates on.
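Case 2 is easy to reproduce (a sketch; the exact form of the "dip", here one sample zeroed, is an assumption):

```python
import numpy as np

N = 100
n = np.arange(N)
x = np.sin(2 * np.pi * n / N)     # f = 1: one cycle per frame
x[25] = 0.0                       # the 1-sample "dip" (assumed form)

mag = np.abs(np.fft.fft(x))
high = mag[2 : N // 2]            # all bins above the f = 1 bin
assert mag[1] > 10 * high.max()   # f = 1 still dominates...
assert np.all(high > 1e-3)        # ...but no higher bin is zero: the dip
                                  # requires every frequency to reconstruct
```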


Case 3: $f=1$.

Now we know the source is a sinusoid of $f=1$, right? It has to be!

Nope. The source could very well be a bunch of triangle waves, square waves, a happy coincidence, or infinitely many other possibilities. But what if we know it's periodic? Even then: sinusoids are only one example, not the only one, of periodic orthogonal basis functions.

P.S.: the slight non-zero values in the coefficients in this case are due to 100 samples imperfectly representing an $f=1$ signal. The same is true in Case 1, but it's far beside the point.


So when do the coefficients exactly represent the "frequencies in the signal", if ever?

Only one possibility; if the signal is:

  1. Comprised of sinusoids ...
  2. of the exact frequencies ...
  3. at the exact amplitudes ...
  4. at the exact phases that are given by the coefficients, ...
  5. each lasting over the entirety of the transformed signal, ...
  6. and nothing else.

In other words, if the coefficients do accurately describe (1), it only "happens to be the case" (i.e. coincidence). The more the source deviates from any of the six conditions above, the less the coefficients satisfy (1) or (2).
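A sketch of that happy coincidence: a signal built to satisfy all six conditions, for which the (normalized) coefficients recover the constituent amplitudes and phases exactly.

```python
import numpy as np

N = 64
n = np.arange(N)
# Bin-aligned complex sinusoids at k = 3 and k = 10, each spanning the
# whole frame, and nothing else -- all six conditions satisfied:
true_coeffs = {3: 2.0 * np.exp(1j * 0.7), 10: 0.5 * np.exp(-1j * 1.2)}
x = sum(c * np.exp(2j * np.pi * k * n / N) for k, c in true_coeffs.items())

X = np.fft.fft(x) / N                # normalize so coefficients = amplitudes
for k, c in true_coeffs.items():
    assert np.isclose(X[k], c)       # exact amplitude and phase recovery
others = np.delete(X, list(true_coeffs))
assert np.allclose(others, 0)        # every other coefficient is (numerically) zero
```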

If the DFT can't tell us anything about the source, why use it? -- Because most of the time we have some knowledge of the source, allowing us to assume some traits and deduce others (e.g. electric motors). Other times we don't care about the nature of the source, but only the signal and how we can manipulate it (e.g. recorded audio, power transmission).


What "is" the DFT? -- A transform. A mapping. An algorithm. Something that takes numbers, and returns other numbers, according to some rule. What it's not is a descriptor of the "true nature" of the signal.

Does the DFT assume the input's periodicity? -- Nothing about the coefficients or the reconstructed signal changes no matter what the signal is outside the portion that was transformed. The DFT describes only that portion, nothing else. The extension of the inverse, however, is periodic, and this becomes relevant in certain operations (e.g. below) - but again, it carries no info on the original signal.

But the DFT's derivation from the Fourier Transform is only valid if the input is periodic. -- The DFT neither requires nor assumes the FT's existence; it stands alone. "From the FT" is only one derivation, not the sole one. Where periodicity is relevant is when the FT's properties are assumed to also apply to the DFT.

Multiplying FT coefficients corresponds exactly to (linear) time-domain convolution, but multiplying DFT coefficients corresponds to circular convolution; to obtain linear convolution with the DFT, one must zero-pad, which changes the coefficients. So in this sense, the interpretation of DFT coefficients changes with the periodicity assumption.
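The padding point can be demonstrated directly (a sketch in NumPy):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
h = np.array([1.0, -1.0, 0.5, 0.0])

# Multiplying DFT coefficients corresponds to CIRCULAR convolution:
circ = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real

# Zero-padding to length >= len(x) + len(h) - 1 recovers LINEAR convolution,
# at the cost of changing the coefficients:
M = len(x) + len(h) - 1
lin = np.fft.ifft(np.fft.fft(x, M) * np.fft.fft(h, M)).real

assert np.allclose(lin, np.convolve(x, h))           # matches linear convolution
assert not np.allclose(circ, np.convolve(x, h)[:4])  # circular result differs
```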


What about the meaning from the "forward transform" perspective? This is more about how we get the coefficients in the first place, and why they take on the values they do, which is its own topic. For a start, it's better posed as a question:

Why does the forward transform yield the coefficients that describe the frequencies, amplitudes, and phases of sinusoids that sum to the (transformed) signal?

I defer to this excellent video and this text for exploring the answer.


NOTE: The answer uses real sinusoids to illustrate concepts for simplicity, which holds for real-valued inputs. However, this isn't the whole story and may mislead; still, we can apply the same concepts to the complete picture. The more complete definition adds one word:

DFT coefficients, $X_k$, give amplitudes and phases of complex sinusoids at integer frequencies $k$, from $0$ to $N-1$, that sum to the original signal $x[n]$, comprised of $N$ points.

For a real-valued input, due to symmetry of $X$'s imaginary components, the imaginary components of (coefficient-multiplied) complex sinusoids sum to zero. However, the imaginary component of the complex sinusoid (basis function) does not contribute zero; it contributes to the real component:

$$ (A + jB) e^{j\theta} = (A + jB) (C + jD) = (AC - BD) + j(AD + BC), \\ C = \cos{(\theta)},\ D = \sin({\theta}) $$

Thus, the real sinusoids used throughout the answer actually owe part to the imaginary component of the complex basis function, so it's not just summing the real component of the basis.

For a complex input, we're doing all the same summation, and can visualize it the same, except $(AD + BC)$ no longer ends up being zero.
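The cancellation can be checked numerically (a sketch): for a real input, every coefficient-weighted complex sinusoid is itself complex, yet across $k$ the imaginary parts sum to zero.

```python
import numpy as np

N = 16
rng = np.random.default_rng(1)
x = rng.standard_normal(N)           # real-valued input

k = np.arange(N)[:, None]
n = np.arange(N)[None, :]
X = np.fft.fft(x)

# terms[k, n] = X[k] * (1/N) e^{j 2 pi k n / N}: one complex sinusoid per row
terms = X[:, None] * np.exp(2j * np.pi * k * n / N) / N

assert np.allclose(terms.sum(axis=0), x)       # the sum reconstructs x exactly
assert np.allclose(terms.sum(axis=0).imag, 0)  # imaginary parts cancel across k
assert not np.allclose(terms.imag, 0)          # ...though individual terms are complex
```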

OverLordGoldDragon