Solved – Incremental Gaussian Process Regression

covariance, gaussian process, linear algebra, online-algorithms, regression

I want to implement incremental Gaussian process regression using a sliding window over data points that arrive one by one through a stream.

Let $d$ denote the dimensionality of the input space, so every data point $x_i$ has $d$ elements.

Let $n$ be the size of the sliding window.

In order to make predictions, I need to compute the inverse of the Gram matrix $K$, where $K_{ij} = k(x_i, x_j)$ and $k$ is the squared exponential kernel.
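For concreteness, here is a minimal sketch of building that Gram matrix in NumPy; the length scale and signal variance are assumed hyperparameters, not something specified above:

```python
import numpy as np

def se_kernel(A, B, length_scale=1.0, signal_var=1.0):
    """Squared exponential kernel between the columns of A (d x m) and B (d x p)."""
    # Pairwise squared Euclidean distances between column vectors.
    sq_dists = (np.sum(A**2, axis=0)[:, None]
                + np.sum(B**2, axis=0)[None, :]
                - 2.0 * A.T @ B)
    return signal_var * np.exp(-0.5 * sq_dists / length_scale**2)

# X holds the current window as d x n column vectors, as in the question.
d, n = 3, 50
rng = np.random.default_rng(0)
X = rng.normal(size=(d, n))
K = se_kernel(X, X)       # n x n Gram matrix, K_ij = k(x_i, x_j)
# In practice one usually inverts K + sigma_n^2 * I (noise/jitter) for stability.
K_inv = np.linalg.inv(K)  # O(n^3) if recomputed from scratch at every step
```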

In order to avoid $K$ growing with every new data point, I thought I could remove the oldest data point before adding a new one, and this way keep the Gram matrix at a fixed size. For example, let $K = \phi(X)^{T}\Sigma\phi(X)$, where $\Sigma$ is the covariance of the weights and $\phi$ is the implicit feature map implied by the squared exponential kernel.

Now let $X=[x_{t-n+1}|x_{t-n+2}|\dots|x_{t}]$ and $X_{new}=[x_{t-n+2}|\dots|x_{t}|x_{t+1}]$, where the $x$'s are $d \times 1$ column vectors.

I need an efficient way to find $K_{new}^{-1}$, ideally reusing $K^{-1}$. This doesn't look like the inverse of a rank-1 updated matrix, which could be handled efficiently with the Sherman–Morrison formula.
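In block form, the two Gram matrices do share an $(n-1) \times (n-1)$ block $K_r$ built from the overlapping points $x_{t-n+2}, \dots, x_t$:

$$K = \begin{pmatrix} k(x_{t-n+1}, x_{t-n+1}) & b^{T} \\ b & K_{r} \end{pmatrix}, \qquad K_{new} = \begin{pmatrix} K_{r} & k_{*} \\ k_{*}^{T} & k(x_{t+1}, x_{t+1}) \end{pmatrix},$$

where $b$ collects the kernel values between the dropped point and the retained points, and $k_{*}$ the values between the retained points and the new point.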

Best Answer

There are several recursive algorithms for doing this. You should take a look at the kernel recursive least squares (KRLS) algorithm and related online GP algorithms.
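The key recursion in sliding-window variants of these algorithms exploits the block structure shown in the question: first downdate $K^{-1}$ to drop the oldest point, then update it with a Schur complement to append the new one, at $O(n^2)$ per slide instead of an $O(n^3)$ re-inversion. A minimal NumPy sketch (the helper names are mine, not from any particular library):

```python
import numpy as np

def downdate_inverse(K_inv):
    """Inverse of K with its first row/column removed, given K^{-1}.

    Partition K^{-1} = [[e, f^T], [f, G]]; the downdated inverse is
    G - f f^T / e.
    """
    e = K_inv[0, 0]
    f = K_inv[1:, 0]
    G = K_inv[1:, 1:]
    return G - np.outer(f, f) / e

def update_inverse(K_inv, k_star, kappa):
    """Inverse of the Gram matrix with one point appended, given K^{-1}.

    k_star holds k(x_i, x_new) for the retained points; kappa = k(x_new, x_new).
    Uses the block inversion formula with scalar Schur complement s.
    """
    v = K_inv @ k_star
    s = kappa - k_star @ v
    top_left = K_inv + np.outer(v, v) / s
    top_right = -v[:, None] / s
    return np.block([[top_left, top_right],
                     [top_right.T, np.array([[1.0 / s]])]])

# Quick numerical check: slide a window by one point over a random SPD
# matrix standing in for the Gram matrix of n + 1 stream points.
rng = np.random.default_rng(1)
n = 6
A = rng.normal(size=(n + 1, n + 1))
M = A @ A.T + (n + 1) * np.eye(n + 1)   # well-conditioned SPD matrix
K_inv = np.linalg.inv(M[:n, :n])        # inverse for the old window
K_inv_new = update_inverse(downdate_inverse(K_inv), M[1:n, n], M[n, n])
assert np.allclose(K_inv_new, np.linalg.inv(M[1:, 1:]))
```

For a GP with observation noise, the same recursion applies to $K + \sigma_n^2 I$; in long streams it is also worth refreshing the inverse from scratch occasionally, since incremental updates accumulate round-off error.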
