Solved – uncorrelated noise and its significance

estimation, noise, signal processing

In many applications, such as estimation theory, when we need to estimate a parameter we usually assume it is observed in the presence of white Gaussian noise with zero mean and some standard deviation. We also rely on this assumption in maximum likelihood estimation (a minimal sketch of this setup follows the questions below). So, my questions are:

  1. Do we consider noise to be uncorrelated or correlated in estimation?

  2. What is the difference between correlated and uncorrelated noise, and what is its significance?

  3. Why do we care whether noise is correlated or uncorrelated in estimation, and why do we say that measurement noise is Gaussian?
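
Here is a minimal sketch of the setup I have in mind, assuming NumPy (the true parameter value, noise level, and sample size are arbitrary choices for illustration): a constant theta observed in zero-mean white Gaussian noise, where the maximum likelihood estimate turns out to be the sample mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values, not taken from any particular application:
theta_true = 2.5   # parameter to estimate
sigma = 1.0        # noise standard deviation
N = 1000           # number of measurements

# Measurement model: y[n] = theta + w[n], with w[n] zero-mean white Gaussian noise.
w = rng.normal(0.0, sigma, size=N)
y = theta_true + w

# With i.i.d. (uncorrelated) Gaussian noise, the maximum likelihood
# estimate of theta is the sample mean of the measurements.
theta_ml = y.mean()
print(f"true theta = {theta_true}, ML estimate = {theta_ml:.3f}")
```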

Best Answer

  1. "Noise" implies portions of an estimate that are random (e.g. unknowable, except in terms of distributional behavior). We consider noise to be uncorrelated. I suppose that within the realm of time series analysis processes like pure random walks might be considered a category of "correlated noise" although I would say that's a bit of a misnomer: random walks have (nonlinear) deterministic and random (noise) components.

  2. "Correlated noise" reads to me like an oxymoron... a little like saying "all the things we know about things we do not know about."

  3. Estimation error is not always assumed to be Gaussian, but it is very commonly presumed to be so. I would imagine that the significance of the central limit theorem, and its deep relationship to so many distributions and processes, makes it such a commonly assumed form of noise (see the second sketch below).
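
Here is a minimal sketch of the contrast in point 1, assuming NumPy (the sample size and the use of a cumulative sum of white noise as the random walk are my own illustrative choices): white Gaussian noise has sample autocorrelation near zero at every nonzero lag, while a random walk built from the same noise stays strongly correlated across lags.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000

white = rng.normal(0.0, 1.0, size=N)  # uncorrelated (white) Gaussian noise
walk = np.cumsum(white)               # random walk: cumulative sum of the same noise

def sample_autocorr(x, lag):
    """Sample autocorrelation of x at a given positive lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

for lag in (1, 10, 100):
    print(f"lag {lag:>3}: white noise = {sample_autocorr(white, lag):+.3f}, "
          f"random walk = {sample_autocorr(walk, lag):+.3f}")
```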
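
And a second sketch for the central limit theorem point in item 3, again assuming NumPy (the exponential noise distribution and the sample sizes are arbitrary): the estimation error of a sample-mean estimator looks more and more Gaussian as the number of averaged measurements grows, even though the underlying noise is clearly non-Gaussian.

```python
import numpy as np

rng = np.random.default_rng(2)

def noise(shape):
    """Zero-mean but clearly non-Gaussian noise (shifted exponential)."""
    return rng.exponential(1.0, size=shape) - 1.0

def excess_kurtosis(x):
    """Excess kurtosis: roughly 0 for a Gaussian, about 6 for an exponential."""
    z = (x - x.mean()) / x.std()
    return np.mean(z**4) - 3.0

# Estimation error of a sample-mean estimator built from n noisy measurements,
# repeated over many independent trials.
for n in (1, 10, 100):
    errors = noise((20000, n)).mean(axis=1)
    print(f"n = {n:>3}: excess kurtosis of the estimation error = {excess_kurtosis(errors):+.3f}")
```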