Context:
I have a single receiver that is receiving multiple signals. Each signal has a few strong harmonics, but I don't know the fundamental frequencies. I would like to show them on a spectrogram-like display, but once the number of signals grows beyond three or four, the spectrogram gets very cluttered.
In other words:
$$ \left| X_{1}(f) + X_{2}(f) + \cdots + X_{N}(f) \right|^2 = |X_{1}(f)|^2 + |X_{2}(f)|^2 + \cdots + |X_{N}(f)|^2 + \text{cross-terms} $$
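For concreteness, here is a minimal sketch of the kind of mixture I mean; the sample rate, fundamentals, and STFT parameters below are just placeholders, not my actual data:

```python
# Minimal sketch of the setup; fs, fundamentals, and STFT parameters
# are placeholder values, not real data.
import numpy as np
from scipy.signal import spectrogram

fs = 8000                               # sample rate (Hz), assumed
t = np.arange(0, 2.0, 1 / fs)           # 2 s of data

rng = np.random.default_rng(0)
fundamentals = rng.uniform(50, 400, 6)  # unknown in practice

x = np.zeros_like(t)
for f0 in fundamentals:
    for k in range(1, 4):               # a few strong harmonics per signal
        x += np.cos(2 * np.pi * k * f0 * t + rng.uniform(0, 2 * np.pi))
x += 0.1 * rng.standard_normal(len(t))  # a little receiver noise

# |STFT of the sum|^2: the individual |X_i|^2 terms plus the cross-terms
f, tt, Sxx = spectrogram(x, fs=fs, nperseg=512, noverlap=384)
```

Plotting `10 * np.log10(Sxx)` for half a dozen such signals is what I mean by "cluttered".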
Question:
Is there another time-frequency distribution that eliminates or greatly reduces these cross-terms and is still computable on a digital computer in reasonable time [ideally better than $O(N^2)$ in the number of samples]?
Comments:
I know other distributions exist, e.g. the Wigner-Ville distribution, but they generally seem to improve on the spectrogram's resolution at the expense of stronger cross-terms. I would be happy with the opposite trade-off: fewer cross-terms, even at the cost of lower resolution.
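For reference, by "cross-terms" I mean what something like the following discrete pseudo-Wigner-Ville produces on the mixture above; the lag window, window length, and frequency-scaling convention here are just one possible choice, not a recommendation:

```python
# Rough pseudo-Wigner-Ville sketch; window and scaling conventions are
# assumptions, and no claim of efficiency is made.
import numpy as np
from scipy.signal import hilbert

def pseudo_wvd(x, win_len=255):
    """Pseudo-WVD of a real signal, shape (win_len, len(x)).

    Output bin k corresponds to frequency k * fs / (2 * win_len).
    """
    z = hilbert(np.asarray(x, dtype=float))   # analytic signal
    N = len(z)
    half = win_len // 2
    lag_win = np.hanning(win_len)             # smoothing across lag only
    W = np.zeros((win_len, N))
    for n in range(N):
        m_max = min(half, n, N - 1 - n)
        m = np.arange(-m_max, m_max + 1)
        # instantaneous autocorrelation r[m] = z[n+m] * conj(z[n-m])
        r = np.zeros(win_len, dtype=complex)
        r[half + m] = z[n + m] * np.conj(z[n - m]) * lag_win[half + m]
        # FFT over the lag variable gives the frequency slice at time n
        W[:, n] = np.fft.fft(np.fft.ifftshift(r)).real
    return W
```

On a harmonic mixture like the one above, this sharpens each component but puts oscillating cross-terms midway between every pair of components, which is the opposite of what I want.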