
I want to know the difference between independent and identically distributed (i.i.d.) noise and white noise.

To my limited knowledge, i.i.d. means that there is no time dependence between samples, while white noise means that there is some relationship involving time dependence.

Actually, I'm not sure whether this is correct. I would also like to know what i.i.d. white noise is.

Can you tell me where we find i.i.d. noise in nature?

Update

[Two images: the first shows the goal, estimating the standard deviation from random variables; the second shows my understanding so far.]


1 Answer


Independence is 'stronger' than whiteness. I believe that independence between the random variables implies whiteness, but whiteness does not imply independence: whiteness means that the random variables are uncorrelated, not necessarily independent.
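To make the gap concrete, here is a minimal NumPy sketch (the construction $y[n] = x[n]\,x[n+1]$ is an illustrative choice): the sequence $y$ is white, but neighbouring samples share a factor of $x$ and are therefore dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_001)   # i.i.d. N(0, 1)

# y[n] = x[n] * x[n+1]: zero mean, unit variance, and uncorrelated at
# every nonzero lag (white), yet adjacent samples share a factor of x,
# so they are statistically dependent.
y = x[:-1] * x[1:]

print(np.corrcoef(y[:-1], y[1:])[0, 1])        # ~0    (whiteness)
print(np.corrcoef(y[:-1]**2, y[1:]**2)[0, 1])  # ~0.25 (dependence)
```

The raw samples look uncorrelated, but squaring them exposes the dependence that correlation alone cannot see.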

I.i.d. noise is used very often when formulating probabilistic models because it makes inference much easier. For instance, if two random variables $X_1$ and $X_2$ are i.i.d., the joint pdf $p(X_1,X_2)$ factors into the product of the individual pdfs, $p(X_1)p(X_2)$. If the two random variables are merely uncorrelated, this factorization is not valid in general.
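A minimal numerical check of this factorization, assuming a standard Gaussian model for illustration (and that SciPy is available):

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# For two i.i.d. N(0, 1) variables, the joint pdf (identity covariance)
# equals the product of the two marginal pdfs at every point.
x1, x2 = 0.3, -1.2   # arbitrary evaluation point
joint = multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2)).pdf([x1, x2])
product = norm.pdf(x1) * norm.pdf(x2)
print(joint, product)   # equal up to floating-point error
```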

In ML estimation, one typically works with the log of the product, $\log(p(X_1)p(X_2)) = \sum_i \log p(X_i)$, because differentiation with respect to the parameter of interest is then much more straightforward.
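As a sketch of why this is convenient, assume the noise is zero-mean Gaussian with unknown variance $\sigma^2$ (an illustrative model choice, not the only one possible); the sum-of-logs form then yields a closed-form ML estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_true = 2.0
x = sigma_true * rng.standard_normal(10_000)   # i.i.d. N(0, sigma^2) noise

# log p(x_1, ..., x_N) = sum_n log p(x_n)
#                      = -(N/2) * log(2*pi*sigma^2) - sum_n x_n**2 / (2*sigma^2)
# Differentiating w.r.t. sigma^2 and setting the result to zero gives
# the maximum-likelihood estimate:
sigma2_ml = np.mean(x**2)
print(np.sqrt(sigma2_ml))   # close to sigma_true = 2.0
```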

I'm not sure where to find i.i.d. noise in the real world, but I believe the assumption of i.i.d. observation noise is made more for convenience than because it is realistic.

  • It should also be noted that for Gaussian random variables, the terms uncorrelated and independent are synonymous; if two Gaussian random variables are uncorrelated, then they are also independent. This is a particular feature of the Gaussian distribution, and isn't generally applicable. So the terms i.i.d. and white are similar when the underlying distribution is Gaussian (with the caveat that i.i.d. implies that all of the random variables are identically distributed, which is stronger than just saying that all of the random variables are jointly uncorrelated). – Jason R Jun 05 '15 at 15:05
  • @JasonR Thanks. Then can you answer the following question of mine? In the answer, for ML estimation the log of the product is considered: $\log(p(X_1)p(X_2)) = \sum_i \log p(X_i)$. Can I then say that the product itself satisfies $p(X_1)p(X_2) = \sum_i p(X_i)$, i.e. can I write $p(X_1)p(X_2) = p(X_1) + p(X_2)$? (I rephrased the equation above to check whether my idea is correct.) – gmotree Jun 05 '15 at 16:30
  • I don't follow how you got from the first equation to the second. The first relies upon a property of the logarithm, namely that $\log(xy) = \log x + \log y$; for instance, $0.2 \cdot 0.5 = 0.1$, which is not $0.2 + 0.5 = 0.7$, yet $\log 0.1 = \log 0.2 + \log 0.5$. – Jason R Jun 05 '15 at 16:34
  • @JasonR Thanks. I have uploaded two pictures: the first shows the goal, estimating the standard deviation from random variables, and the second shows what I understand so far. – gmotree Jun 05 '15 at 17:12
  • I have no idea what you're asking, sorry. – Jason R Jun 05 '15 at 17:25
  • @JasonR Is there any relationship between "ML estimation" and "estimated standard deviation"? – gmotree Jun 05 '15 at 17:29
  • That's somewhat of a nonsensical question. Maximum-likelihood is a method used to estimate properties of a random variable. The standard deviation could be one of those properties, but your question doesn't really make sense. I don't think I can help you out here. – Jason R Jun 05 '15 at 17:37
  • Sir, the Gaussian seems to be the same as the Poisson distribution in this respect. The terms uncorrelated and independent that you used are synonymous; – gmotree Jun 05 '15 at 18:19
  • @JasonR Your statement "It should also be noted that for Gaussian random variables, the terms uncorrelated and independent are synonymous; if two Gaussian random variables are uncorrelated, then they are also independent." is incorrect. The standard counter-example is $X \sim N(0,1)$, $Z$ independent of $X$ and equally likely to take on values $+1$ and $-1$, and $Y = ZX$. It is straightforward to verify that $Y \sim N(0,1)$ while $$\operatorname{cov}(X,Y) = E[XY] - E[X]E[Y] = E[X^2Z] = E[X^2]E[Z] = 0.$$ (A numerical check of this counterexample appears after these comments.) – Dilip Sarwate Jun 05 '15 at 18:40
  • @DilipSarwate: Indeed, you're right. This is a popular enough misconception that there is a Wikipedia article about it. Only a pair of jointly Gaussian random variables has the property that I referenced. – Jason R Jun 05 '15 at 18:54
  • @niaren: I am confused about estimating the standard deviation from i.i.d. noise. From what you said, i.i.d. noise can be described by the joint pdf $p(x_1,x_2)$ of the random variables. How, then, can we estimate the standard deviation from the joint pdf $p(x_1,x_2)$? – gmotree Jun 08 '15 at 00:23