Solved – Multivariate exponential smoothing and Kalman filter equivalence

exponential-smoothing, filter, kalman-filter

Suppose the time series $X$ is a hidden-state Gaussian random walk and we observe $Y = X + e$, where $e$ is Gaussian white noise independent of $X$.

The Kalman estimator of $X$ in this case has a closed-form steady-state solution and corresponds to an exponential moving average smoother with a constant smoothing parameter. The optimal smoothing parameter has the form $\lambda = \frac{p}{1-p}$, where $p$ solves a quadratic in the signal-to-noise ratio between $\Delta x$ and $e$. The closed form can be found by searching for 'kalman solution random walk noise'.
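The equivalence can be sketched numerically. Below is a minimal sketch assuming the standard parameterization $x_t = x_{t-1} + w_t$ with $\operatorname{Var}(w_t) = q$ and $y_t = x_t + e_t$ with $\operatorname{Var}(e_t) = r$ (the names `q`, `r`, and `steady_state_gain` are mine, not from the question). The steady-state prior variance $S$ solves the Riccati quadratic $S^2 - qS - qr = 0$, and the resulting gain $K = S/(S+r)$ is exactly the EWMA smoothing parameter:

```python
import numpy as np

def steady_state_gain(q, r):
    """Steady-state Kalman gain for a random walk observed in white noise.

    Assumed model: x_t = x_{t-1} + w_t,  w ~ N(0, q)
                   y_t = x_t + e_t,      e ~ N(0, r)
    The steady-state prior variance S is the positive root of
    S^2 - q*S - q*r = 0, and the gain is K = S / (S + r).
    """
    S = (q + np.sqrt(q**2 + 4 * q * r)) / 2.0
    return S / (S + r)

# Simulate the model and filter with a plain EWMA using alpha = K.
rng = np.random.default_rng(0)
q, r, n = 0.5, 2.0, 500
x = np.cumsum(rng.normal(0, np.sqrt(q), n))   # hidden random walk
y = x + rng.normal(0, np.sqrt(r), n)          # noisy observations

K = steady_state_gain(q, r)
ewma = np.empty(n)
ewma[0] = y[0]
for t in range(1, n):
    # EWMA recursion == steady-state Kalman update for this model
    ewma[t] = (1 - K) * ewma[t - 1] + K * y[t]
```

Note that $K$ depends on $q$ and $r$ only through the signal-to-noise ratio $q/r$, which is the quadratic dependence alluded to above.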

If we instead have multiple independent observations $Y_1, Y_2, \dots$, i.e. with independent noises $e_1, e_2, \dots$, is there a closed-form solution for the optimal estimator of $X$? What would it look like?

If the $e_1, e_2, \dots$ were independent and identically distributed (same standard deviation), I imagine the best estimator of $X$ is simply the average of the estimators obtained by treating the problem as univariate for each $Y_i$.

If the $e_1, e_2, \dots$ have different variances, then the $Y_i$ series with poorer signal-to-noise ratios should receive lower weights in the overall estimator. Perhaps the closed-form solution is linear in the $Y_i$, with coefficients proportional to each series' signal-to-noise ratio.
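This guess can be made concrete. Writing $\sigma_i^2 = \operatorname{Var}(e_i)$, the standard way to combine independent unbiased observations of the same quantity is the inverse-variance weighted mean, whose weights are proportional to each series' precision (equivalently, to its signal-to-noise ratio for a common signal):

$$\bar{Y}_t = \frac{\sum_i Y_{i,t}/\sigma_i^2}{\sum_i 1/\sigma_i^2}, \qquad \operatorname{Var}(\bar{e}_t) = \frac{1}{\sum_i 1/\sigma_i^2}.$$

The combined series $\bar{Y}$ then has a smaller effective noise variance than any individual $Y_i$.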

Is there a known closed-form solution? Google does not seem to be much help on this problem.

Best Answer

You haven't specified, but if the observation noise is independent of the model noise, then I think you should first combine the $Y_i$ estimates using the simple observation-variance-weighted mean described here:

http://en.wikipedia.org/wiki/Weighted_mean#Dealing_with_variance

After that, apply the Kalman filter as if it were a univariate model. The combined series has lower observation noise, hence you get a better estimate of $X$.
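The two steps above can be sketched as follows. This is a minimal illustration assuming the random-walk model with process variance `q` and per-series noise variances `rs` (all names and the example numbers are mine); the combine step is the inverse-variance weighted mean from the linked Wikipedia section, and the filter step is a plain univariate Kalman recursion:

```python
import numpy as np

def combine_observations(ys, rs):
    """Inverse-variance weighted mean of independent observations of the
    same hidden state. ys: (k, n) array of k observed series; rs: length-k
    noise variances. Returns the combined series and its (smaller)
    effective observation variance."""
    w = 1.0 / np.asarray(rs, dtype=float)          # weights ∝ 1/variance
    y_bar = (w[:, None] * np.asarray(ys)).sum(axis=0) / w.sum()
    r_bar = 1.0 / w.sum()                          # effective noise variance
    return y_bar, r_bar

def kalman_random_walk(y, q, r, x0=0.0, p0=1.0):
    """Univariate Kalman filter for x_t = x_{t-1} + w_t, y_t = x_t + e_t."""
    x, p = x0, p0
    out = np.empty(len(y))
    for t, yt in enumerate(y):
        p = p + q                   # predict: variance grows by q
        k = p / (p + r)             # Kalman gain
        x = x + k * (yt - x)        # update state estimate
        p = (1 - k) * p             # update variance
        out[t] = x
    return out

# Example: three sensors with different noise variances (illustrative numbers).
rng = np.random.default_rng(1)
q, rs, n = 0.2, [1.0, 4.0, 9.0], 300
x = np.cumsum(rng.normal(0, np.sqrt(q), n))        # hidden random walk
ys = np.stack([x + rng.normal(0, np.sqrt(r), n) for r in rs])

y_bar, r_bar = combine_observations(ys, rs)        # step 1: combine
xhat = kalman_random_walk(y_bar, q, r_bar)         # step 2: univariate filter
```

Since `r_bar` is smaller than every individual variance in `rs`, the effective signal-to-noise ratio of the combined series is higher, which is exactly why the resulting estimate of $x$ improves.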
