
For standard Gaussian white noise, the covariance matrix is an identity matrix. What about other coloured noises (generated from standard Gaussian): Brown (red), Pink, Blue, Violet?

Additional details / thoughts: Preferably looking for Python code, since I don't have the domain knowledge to understand the "process". Maybe generating 2D noise and then finding the covariance (numpy.cov) would work? I found https://stackoverflow.com/questions/67085963/generate-colors-of-noise-in-python to generate 1D noise, but I am not sure how to extend it to 2D.

Update for clarity: https://en.wikipedia.org/wiki/White_noise#White_noise_vector

The covariance matrix $R$ of the components of a white noise vector $w$ with $n$ elements must be an $n \times n$ diagonal matrix, where each diagonal element $R_{ii}$ is the variance of component $w_i$; and the correlation matrix must be the $n \times n$ identity matrix.

I want to find covariance matrix R of the components of coloured noise vector w with n elements.

Will generating enough noise vectors, stacking them as columns, and computing the covariance of that matrix give the expected covariance matrix? This matches what I expect from the covariance matrix: https://en.wikipedia.org/wiki/Estimation_of_covariance_matrices#Estimation_in_a_general_context
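For the white-noise case at least, that procedure is easy to sanity-check with numpy (the sizes below are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 8, 100_000          # vector length, number of noise vectors

# Each column is one draw of a length-N standard Gaussian white noise vector
samples = rng.standard_normal((N, M))

# numpy.cov treats rows as variables and columns as observations,
# so this estimates the N x N covariance matrix of the components
R = np.cov(samples)

# For standard white noise the estimate should approach the identity
print(np.allclose(R, np.eye(N), atol=0.05))   # True
```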

sdnemina
  • So I'm a bit confused by conflicting statements. 1. are you trying to generate noise, or to find the covariance matrix of noise that you already have? 2. Are we talking about covariance, a property that links two separate random variables, as indicated by your 2D considerations and the word "covariance matrix", or are we talking about autocovariance, a property of a single time-dependent random variable, as indicated by "colored noise"? – really not Constantine A. B. Jan 26 '24 at 10:27
  • @ConstantineA.B. I am now a bit confused about the wording too :) . Maybe autocovariance matrix is what I am looking for. To be more clear let me quote wikipedia: "The covariance matrix R of the components of a white noise vector w with n elements must be an n by n diagonal matrix, where each diagonal element Rii is the variance of component wi; and the correlation matrix must be the n by n identity matrix." I want to find the covariance matrix for coloured noises (brown, pink, blue, violet). – sdnemina Jan 26 '24 at 17:00
  • But "colored noise" is 1-dimensional, changing over time, not an N>1 dimensional vector. You can of course have an N-dimensional vector of noise processes, but then each entry in that vector isn't just a random number, but a random function of time. – Marcus Müller Jan 26 '24 at 19:12
  • and whether or not these elements of the vector are related/correlated is not subject of the color of the noise at all. You can have two perfectly correlated white noise sources (e.g. X, Y=-X), or two perfectly independent white noise sources; and the same is true for every color of noise. Color tells you something about how a random variable with a time dependence relates with itself of the past, and a covariance matrix tells you something about how different random variables relate. – Marcus Müller Jan 26 '24 at 19:26
  • @MarcusMüller Yes, looks like I am not clear with the terminologies and concepts :). For a 1-D colour noise series, what I wanted the covariance matrix C to convey in C(i, j) was how noise at time step i of the series relates to time step j of the series. So in the case of white noise, C(i, j) = 0 when i is not equal to j. – sdnemina Jan 27 '24 at 08:03
  • 1
    you can put that in a Matrix, but it's going to be very "repeatingly", content-wise: The information you're looking for is the autocorrelation function, i.e., just a 1D construct, not a 2D construct like a matrix, because there's only one parameter to vary: the delay of the noise to itself for which you say how correlated they are. The good news is that for weak-sense stationary signals (and all "colored" noises you meet are usually modelled as such), the autocorrelation function is linked to the power spectral density (the "power" plots) by the Fourier transform. For example: Pink noise has – Marcus Müller Jan 27 '24 at 11:23
  • PSD of $\Phi_{NN}(f) = 1/f$, so the autocorrelation function is just the inverse Fourier transform of that, $\varphi_{NN}(\tau) = \mathcal{F}^{-1}\{1/f\}(\tau)$. Now, that transform isn't inherently easy to compute (and it's infinite in length; your "Matrix" would also become infinitely large). That's why I wonder what you need this matrix or vector description for. Seeing you're confused about a lot of things, I could imagine this not being overly useful to you, and you might rather want to ask a new question about what you wanted to do with that Matrix representation! – Marcus Müller Jan 27 '24 at 11:27
  • If you edit your question, be rewording it to fit a title "Do I need a 2D covariance matrix to describe a 1D noise process?" (and then changing the title), then the last two comments by @MarcusMüller become the answer. Usually we describe the statistics of a 1D random process with an autocorrelation function (or vector), and use a 2D correlation matrix to describe the statistics of a 2D random process (i.e., a process that outputs a 1D vector at each time step). – TimWescott Jan 27 '24 at 16:05
  • @MarcusMüller the correlation matrix contains the same data as the autocorrelation function if you assume the data is zero outside the averaging window. This will result in a biased estimate. An unbiased estimate will not have that same assumption, and therefore won’t be defined by the autocorrelation function. Additionally, to get a good PSD estimate, imo, you need a correlation matrix based estimator. – Baddioes Jan 27 '24 at 18:51
  • @Baddioes agreed on "ACF and finite AC matrix contain the same info only if the ACF is bounded in length", but the assumption that there's no correllation for large $\tau$ is quite antithetical to things like Pink and Brown Noise, were energy is concentrated at low frequencies (for pink noise, the power isn't even bounded at f=0!). It was probably not a good idea of me to call what one would need to be equivalent to the ACF (which is representable by an infinite-length vector) a "matrix" (it's a linear map, alright), without huge blinking warning signs. – Marcus Müller Jan 27 '24 at 18:59

1 Answer


You can at least approximate colored noise by altering the spectrum of white noise. The following are some common definitions of colored-noise power spectral densities:

\begin{equation} S_{pink}(f) = \frac{S_{0}}{f^{\alpha}} \end{equation}

where $\alpha$ is close to 1,

\begin{equation} S_{brown}(f) = \frac{S_{0}}{f^{2}}\end{equation}

\begin{equation} S_{blue}(f) = S_{0}f\end{equation}

You can see more here. I'm not a python person, so I'm not going to write python, but the basic procedure is something like: generate white noise $\rightarrow$ N-D FFT $\rightarrow$ N-D filter (e.g. for pink noise, element-wise multiply the white noise spectrum by $\frac{1}{\sqrt{f}}$, since the PSD goes as the squared magnitude of the spectrum) $\rightarrow$ N-D IFFT. This will give you an approximation of your desired colored noise.
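As a rough 1-D numpy sketch of that procedure (the function name and the unit-variance normalisation at the end are my own choices; the amplitude spectrum is scaled by $f^{-\alpha/2}$ so that the PSD goes as $f^{-\alpha}$):

```python
import numpy as np

def colored_noise(n, alpha, rng=None):
    """Shape white noise so its PSD goes roughly as 1/f**alpha.

    alpha = 1 -> pink, 2 -> brown (red), -1 -> blue, -2 -> violet.
    Sketch only: the amplitude spectrum is scaled by f**(-alpha/2)
    because the PSD is proportional to |X(f)|**2.
    """
    rng = np.random.default_rng() if rng is None else rng
    white = rng.standard_normal(n)
    X = np.fft.rfft(white)
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                 # avoid division by zero at DC
    X *= f ** (-alpha / 2)
    x = np.fft.irfft(X, n)
    return x / x.std()          # normalise to unit variance

pink = colored_noise(4096, 1)
```

The same idea extends to 2-D by using numpy.fft.fft2 and scaling with a radial frequency grid built from numpy.meshgrid.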

For generating a covariance (correlation) matrix, the function you are likely looking for is spectrum's corrmtx, which is equivalent to Matlab's corrmtx. This will give you a correlation matrix estimate for 1-D data via time-averaging. However, you seem to be unsure of what the correlation matrix is and how to find it. For a data vector

\begin{equation}\underline{x}=\begin{bmatrix}x(0) & x(1) & \cdots & x(N-1)\end{bmatrix}^{T}\end{equation}

The correlation matrix is defined as

\begin{equation} R_{xx} = E\{\underline{x}\,\underline{x}^{H}\}\end{equation}

This means that the correlation matrix is the ensemble averaged outer product, and thus represents the complete set of statistical second moments of a data vector $\underline{x}$. Theoretically, for a vector of length $N$, the correlation matrix is an $N$-by-$N$ matrix. Estimating via time-averaging will typically decrease the size of the estimated correlation matrix.
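For concreteness, here is a plain-numpy sketch (avoiding the spectrum package) of a biased time-averaged estimate for a 1-D sequence; the function name is mine, and it builds a Hermitian Toeplitz estimate from the sample autocorrelation:

```python
import numpy as np

def corr_matrix_estimate(x, p):
    """Biased time-averaged estimate of the (p+1)-by-(p+1) correlation
    matrix of a 1-D sequence x, built from its sample autocorrelation."""
    N = len(x)
    # biased sample autocorrelation r[k] = (1/N) * sum_n x[n+k] * conj(x[n])
    r = np.array([np.dot(x[k:], np.conj(x[:N - k])) / N for k in range(p + 1)])
    # Hermitian Toeplitz matrix: R[i, j] = r[i-j] for i >= j, conj(r[j-i]) otherwise
    i, j = np.indices((p + 1, p + 1))
    R = np.where(i >= j, r[np.abs(i - j)], np.conj(r[np.abs(i - j)]))
    return R

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
R = corr_matrix_estimate(white, 4)   # close to the 5x5 identity for white noise
```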

Estimating the correlation matrix for 2-D data, for example, will result in a block Toeplitz matrix. Estimating correlation matrices of 2-D or higher-dimensional data is pretty conceptually challenging, so I would start with 1-D and make sure you understand that.

EDIT 1: Question about Covariance Matrix vs. Correlation Matrix

Wikipedia's explanation doesn't seem right. The covariance matrix is defined as

\begin{equation}K_{xx} = E\{(\underline{x}-\mu_{x})(\underline{x}-\mu_{x})^{H}\}\end{equation}

where $\mu_{x}$ is assumed constant for a Wide-Sense Stationary random process. For noise to be white, it must be zero-mean. There are several Stack Exchange posts on this; I'll link this one. This means, for white noise,

\begin{equation} R_{xx} = K_{xx} = \sigma_{0}^{2}I\end{equation}

where $I$ is the identity matrix.

Practically speaking, any property that applies to the correlation matrix also applies to the covariance matrix, although the converse is not true. I find estimating the covariance matrix more tedious, and it doesn't always provide a huge benefit, especially when dealing with higher-dimensional data, but there very well could be practical uses for it that I'm unaware of. To estimate it, you have to subtract off the mean.

Responding to your updated question, that is not how you would calculate the covariance matrix. You would still use the corrmtx function, but your data input would be the data with the mean subtracted off. If you wanted to mimic ensemble averaging, you could form an $N$-by-$M$ matrix that contains $M$ draws of an $N$ length random vector, where $M \gg N$. You would then take the outer product of each column and average the $M$ outer products.
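A minimal numpy sketch of that ensemble-averaged estimate (sizes are arbitrary; for real data $\underline{x}\,\underline{x}^{H}$ reduces to $\underline{x}\,\underline{x}^{T}$):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 8, 200_000                       # M >> N

# N-by-M matrix: each column is one draw of an N-length random vector
draws = rng.standard_normal((N, M))

# subtract the per-component mean, as required for a covariance estimate
draws -= draws.mean(axis=1, keepdims=True)

# averaging the M outer products of the columns equals draws @ draws.T / M
K = draws @ draws.T / M

# for zero-mean, unit-variance white noise this approaches the identity
print(np.allclose(K, np.eye(N), atol=0.05))   # True
```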

Baddioes
  • Bit uneasy about not using the word "autocovariance matrix" when we're really talking about one and the same variable, not two. – really not Constantine A. B. Jan 26 '24 at 10:28
  • @Baddioes Thanks for explaining a good bit of background about this! To be more specific, I am looking for an NxN covariance for an arbitrary coloured noise data vector, so maybe the right term is expected covariance matrix? (see the update in the question). – sdnemina Jan 26 '24 at 17:08
  • @ConstantineA.B. you’re not wrong. I just prefer to call it the correlation matrix because the unbiased method of time-averaging is known as the “covariance” method, and the pre- and post-windowed biased method is known as the “autocorrelation” method. – Baddioes Jan 26 '24 at 17:16
  • @sdnemina See the edit to my answer. – Baddioes Jan 26 '24 at 18:09