Solved – Kalman Filter Expectation Maximization

expectation-maximization, kalman-filter, python

I'm not very familiar with the EM algorithm for the Kalman filter. I've been using pykalman for my analysis in Python; the package comes with a simple EM routine:

import numpy as np
from pykalman import KalmanFilter

# Initial guesses for the state-space matrices
kf = KalmanFilter(transition_matrices=[[1, 1], [0, 1]],
                  observation_matrices=[[0.1, 0.5], [-0.3, 0.0]])
measurements = np.asarray([[1, 0], [0, 0], [0, 1]])  # 3 observations
kf = kf.em(measurements, n_iter=5)                   # estimate parameters by EM
filtered_state_means, filtered_state_covariances = kf.filter(measurements)

I was wondering whether using the EM algorithm to estimate the Kalman filter parameters induces some kind of look-ahead bias. Does the EM algorithm use the full sample of observations to estimate the parameters, or only the observations from time $t=0$ up to time $t-1$?

Best Answer

I don't see a reason for the EM algorithm not to induce a look-ahead bias. As mentioned in the comments of this piece of code from the implementation you mention, the EM algorithm uses the Kalman smoother to evaluate the expected likelihood. Unlike the filtered state vector, the smoothed state vector is estimated conditional on the entire observed series, from $t=0,\dots,n$ (not just up to $t-1$), so the parameter estimates do draw on the full sample.
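A quick way to see the difference yourself (a minimal sketch, not from the original answer, using made-up toy data) is to compare the filtered and smoothed estimates at some interior time $t$ when the later observations are dropped:

import numpy as np
from pykalman import KalmanFilter

np.random.seed(0)
measurements = np.random.randn(30, 2)  # toy 2-d observation series

kf = KalmanFilter(transition_matrices=[[1, 1], [0, 1]],
                  observation_matrices=[[0.1, 0.5], [-0.3, 0.0]])

t = 15

# Filtering is causal: the estimate at time t does not change when
# observations after t are appended to the series.
filt_full, _ = kf.filter(measurements)
filt_trunc, _ = kf.filter(measurements[:t + 1])
print(np.allclose(filt_full[t], filt_trunc[t]))      # True

# Smoothing is not causal: the estimate at time t shifts once later
# observations are included, and these smoothed quantities are what
# the EM iterations are built on.
smooth_full, _ = kf.smooth(measurements)
smooth_trunc, _ = kf.smooth(measurements[:t + 1])
print(np.allclose(smooth_full[t], smooth_trunc[t]))  # generally False

If the look-ahead matters for your application (e.g. a backtest), one workaround is to re-run kf.em on an expanding window, so that the parameters applied at time $t$ are fitted only on observations up to $t$.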

The description of the EM algorithm given in Section 3.1 of this document, in the context of a structural time series model, may give you further insight into the inner workings of this algorithm.
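For orientation, here is a rough sketch of the EM iteration for a linear Gaussian state-space model (my notation, not necessarily that of the linked document). With state equation $x_{t+1} = A x_t + w_t$, $w_t \sim N(0, Q)$, and observation equation $y_t = C x_t + v_t$, $v_t \sim N(0, R)$, each iteration consists of:

E-step: run the Kalman filter and smoother over the whole series to obtain $\hat{x}_{t|n} = \mathrm{E}[x_t \mid y_{0:n}]$, the smoothed covariances $P_{t|n}$, and the lag-one covariances $P_{t,t-1|n}$.

M-step: update the parameters in closed form, for example
$$A^{\text{new}} = \Big(\sum_{t=1}^{n} \mathrm{E}[x_t x_{t-1}^\top \mid y_{0:n}]\Big)\Big(\sum_{t=1}^{n} \mathrm{E}[x_{t-1} x_{t-1}^\top \mid y_{0:n}]\Big)^{-1},$$
where $\mathrm{E}[x_t x_{t-1}^\top \mid y_{0:n}] = P_{t,t-1|n} + \hat{x}_{t|n}\hat{x}_{t-1|n}^\top$.

Every expectation in the E-step conditions on the full sample $y_{0:n}$, which is exactly why the smoothed (full-sample) quantities, not just the data up to $t-1$, drive the parameter estimates.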
