Incremental solution for matrix inverse using Sherman–Morrison in $O(N^2)$

linear algebra, machine learning, matrices, numerical linear algebra

I have been reading a presentation on Value Function Approximation by David Silver (Introduction to Reinforcement Learning course). On page 43 he gives a solution to linear least squares for an unknown parameter vector $\bf{w}$. He claims that, for $N$ features, inverting a matrix is $O(N^3)$, but that an incremental solution using the Sherman–Morrison formula is possible with complexity $O(N^2)$.

Since the Sherman–Morrison formula allows computing the inverse of the sum of an invertible matrix $\bf{A}$ and the outer product $\bf{uv^T}$ of vectors $\bf{u}$ and $\bf{v}$, I don't really see how it applies to this case.

Best Answer

At time $T-1$, we have computed $$\left(\sum_{t=1}^{T-1}x(s_t)x(s_t)^T \right)^{-1}.$$

Now, at time $T$, we want to compute

$$\left(\sum_{t=1}^{T-1}x(s_t)x(s_t)^T + x(s_T)x(s_T)^T\right)^{-1}.$$

Let $A = \sum_{t=1}^{T-1}x(s_t)x(s_t)^T$ and $u = v = x(s_T)$. Then the Sherman–Morrison formula gives

$$\left(A + uv^T\right)^{-1} = A^{-1} - \frac{A^{-1}uv^TA^{-1}}{1 + v^TA^{-1}u},$$

which uses only matrix–vector products with the already-computed $A^{-1}$, so each incremental update costs $O(N^2)$ rather than the $O(N^3)$ of a fresh inversion.
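For concreteness, here is a minimal NumPy sketch of this update. The random feature vectors and the small regulariser $\epsilon I$ (which makes the initial matrix invertible before enough samples are seen) are illustrative assumptions, not part of the slide:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 5   # number of features (illustrative)
T = 50  # number of time steps

# Hypothetical feature vectors x(s_t); in practice these come from the states.
X = rng.normal(size=(T, N))

# Start from (eps * I)^{-1} so the running matrix is always invertible.
eps = 1e-3
A_inv = np.eye(N) / eps

for x in X:
    # Sherman–Morrison with u = v = x:
    #   (A + x x^T)^{-1} = A^{-1} - (A^{-1} x)(x^T A^{-1}) / (1 + x^T A^{-1} x)
    # Only matrix-vector products and one outer product: O(N^2) per step.
    Ax = A_inv @ x
    A_inv -= np.outer(Ax, Ax) / (1.0 + x @ Ax)

# Check against direct O(N^3) inversion of eps*I + sum_t x_t x_t^T.
A_direct = eps * np.eye(N) + X.T @ X
print(np.allclose(A_inv, np.linalg.inv(A_direct)))  # True
```

Note that `A_inv` stays symmetric throughout (the matrix is $\epsilon I$ plus a sum of outer products), which is why `A_inv @ x` can serve as both $A^{-1}u$ and $(v^TA^{-1})^T$ in the update.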