Independence is "stronger" than whiteness: independence between the random variables implies whiteness, but whiteness does not imply independence. Whiteness only means that the random variables are uncorrelated, not necessarily independent.
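A classic illustration of this gap, as a minimal sketch (the choice of $X \sim \mathcal{N}(0,1)$ and $Y = X^2$ is my own example, not from the answer): $Y$ is a deterministic function of $X$, so the two are clearly dependent, yet their covariance $E[X^3] = 0$, so they are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = x**2  # deterministic function of x, so clearly dependent on x

# Uncorrelated: Cov(X, X^2) = E[X^3] = 0 for zero-mean symmetric X,
# so the sample correlation is close to 0
corr = np.corrcoef(x, y)[0, 1]
print(f"sample correlation: {corr:.4f}")

# Dependent: knowing x pins y down exactly, so the joint pdf
# does not factor into p(x) * p(y)
```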
The use of i.i.d. noise is seen very often when formulating probabilistic models because it makes inference much easier. For instance, if two random variables $X_1$ and $X_2$ are i.i.d. (in particular, independent), the joint pdf $p(X_1,X_2)$ factors into the product of the individual pdfs, $p(X_1)\,p(X_2)$. If the two random variables are merely uncorrelated, this factorization is not valid in general.
In ML estimation one typically works with the log of the product, $\log\big(p(X_1)p(X_2)\big) = \log p(X_1) + \log p(X_2)$, or for $n$ samples $\sum_{i=1}^{n} \log p(X_i)$, because differentiation with respect to the parameter of interest is then much more straightforward.
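To make the log-of-product step concrete, here is a minimal sketch assuming i.i.d. Gaussian observations (the specific data and the helper name `gaussian_log_likelihood` are my own, not from the answer). Because the samples are independent, the joint log-likelihood is a sum of per-sample log pdfs, and maximizing it recovers the familiar closed-form estimates of the mean and standard deviation:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical i.i.d. Gaussian observations with mean 3 and std 2
data = rng.normal(loc=3.0, scale=2.0, size=50_000)

def gaussian_log_likelihood(data, mu, sigma):
    # Sum of log pdfs -- this summation is valid only because the
    # samples are independent, so the joint pdf factors
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (data - mu)**2 / (2 * sigma**2))

# Setting the derivative of the summed log-likelihood to zero gives
# the closed-form ML estimates:
mu_hat = data.mean()           # ML estimate of the mean
sigma_hat = data.std(ddof=0)   # ML estimate of the std (biased form)

# The ML estimates attain a higher log-likelihood than perturbed values
assert gaussian_log_likelihood(data, mu_hat, sigma_hat) >= \
       gaussian_log_likelihood(data, mu_hat + 0.1, sigma_hat)
```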
I'm not sure where to find truly i.i.d. noise in the real world, but I believe the assumption of i.i.d. observation noise is made more out of convenience than because it is realistic.
In the answer, the log of the product is considered in ML estimation: $\log(p(X_1)p(X_2)) = \sum \log p(X)$. Can I then say that the product itself is considered, $p(X_1)p(X_2) = \sum p(X)$? If yes, can I also write $p(X_1)p(X_2) = p(X_1) + p(X_2)$? (I just rephrased the equation above to check whether my idea is correct.)
– gmotree Jun 05 '15 at 16:30

From what you said, the i.i.d. noise can be described by the joint pdf $p(x_1,x_2)$ of the random variables.
Then how can we estimate the standard deviation from the joint pdf $p(x_1,x_2)$?
– gmotree Jun 08 '15 at 00:23