Let $T(n)=S(n)+an+b$, where the constants $a,b$ will be chosen later.
Then
$$S(n)+an+b=S(n-1)+a(n-1)+b+S(n-2)+a(n-2)+b-(4n-13) \,.$$
Thus
$$S(n)=S(n-1)+S(n-2) +an-3a+b -(4n-13) \,.$$
Now, if we can make
$$an-3a+b=4n-13 \tag{$*$}$$
we get
$$S(n)=S(n-1)+S(n-2) \,,$$
and hence, as for the Fibonacci numbers,
$$\begin{bmatrix}S(n+1) \\ S(n) \end{bmatrix} = A^n \begin{bmatrix}S(1) \\ S(0)\end{bmatrix}, \qquad A=\begin{bmatrix}1 & 1 \\ 1 & 0\end{bmatrix}.$$
You can now calculate $S(n)$ in $O(\log n)$ time, and to get $T(n)$ you add back $an+b$, where matching coefficients in $(*)$ gives $a=4$ and $b=3a-13=-1$.
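The whole scheme can be sketched as follows. This is a minimal illustration, not a reference implementation: the recurrence fixes $a=4$, $b=-1$, but the base cases $T(0),T(1)$ are not given in the answer, so they are parameters here.

```python
# Sketch: evaluate T(n) = T(n-1) + T(n-2) - (4n - 13) in O(log n) time
# via the substitution T(n) = S(n) + 4n - 1  (a = 4, b = -1 from (*)),
# so that S(n) = S(n-1) + S(n-2) and S(n) = T(n) - 4n + 1.
# Base cases T(0), T(1) are parameters (not fixed by the answer above).

def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as tuples of tuples."""
    return (
        (X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]),
        (X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]),
    )

def mat_pow(M, n):
    """Exponentiation by squaring: O(log n) 2x2 matrix multiplications."""
    result = ((1, 0), (0, 1))  # identity
    while n > 0:
        if n & 1:
            result = mat_mul(result, M)
        M = mat_mul(M, M)
        n >>= 1
    return result

def T(n, t0, t1):
    """T(n) for base cases T(0) = t0, T(1) = t1."""
    s0 = t0 - (4*0 - 1)  # S(0) = T(0) + 1
    s1 = t1 - (4*1 - 1)  # S(1) = T(1) - 3
    A_n = mat_pow(((1, 1), (1, 0)), n)
    s_n = A_n[1][0]*s1 + A_n[1][1]*s0  # bottom row of A^n [S(1), S(0)]^T
    return s_n + 4*n - 1
```

With $n=0$ the matrix power is the identity, so the formula reduces to $S(0)+4\cdot 0-1=T(0)$ and no special case is needed.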
Multiplying two $2\times 2$ matrices can be done with eight multiplications and four additions, so yes, that is constant time. (In fact it can be done in fewer multiplications at the cost of more additions, but the point remains.)
Multiplying a $2\times 2$ matrix by $\begin{pmatrix}1 & 1\\1 & 0\\\end{pmatrix}$ can be done with just two additions, which is even faster.
But computing $F_n$ using $F_n=F_{n-1}+F_{n-2}$ takes just one addition, so why use matrices at all?
The answer is that computing $M^n$ for a matrix $M$ can actually be done using $O(\log n)$ multiplications and additions, using exponentiation by squaring. Check it out!
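A short sketch of this point: the loop below computes $F_n$ by squaring $M=\begin{pmatrix}1 & 1\\1 & 0\end{pmatrix}$, and a counter records how many matrix multiplications are performed, which grows like $\log n$ rather than the $n-1$ additions of the naive loop. (The counter is only for illustration.)

```python
# Sketch: F_n via exponentiation by squaring of M = [[1,1],[1,0]].
# mul_count tracks 2x2 matrix multiplications: O(log n) of them,
# versus n-1 additions for the naive F_n = F_{n-1} + F_{n-2} loop.

mul_count = 0

def mat_mul(X, Y):
    global mul_count
    mul_count += 1
    return (
        (X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]),
        (X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]),
    )

def fib(n):
    """F_0 = 0, F_1 = 1; uses M^n = [[F_{n+1}, F_n], [F_n, F_{n-1}]]."""
    M = ((1, 1), (1, 0))
    R = ((1, 0), (0, 1))  # identity
    while n > 0:
        if n & 1:
            R = mat_mul(R, M)
        M = mat_mul(M, M)
        n >>= 1
    return R[0][1]
```

For $n$ around $10^6$ this does roughly forty matrix multiplications instead of a million additions, which is the whole payoff of the matrix form.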
Best Answer
If $a_{n} = a_{n-1} + a_{n-2}$, then $$ {\begin{bmatrix} 1 & 1 \\ 1 & 0 \\ \end{bmatrix}}^n \begin{bmatrix} a_1 \\ a_0 \\ \end{bmatrix} = \begin{bmatrix} a_{n+1} \\ a_n \end{bmatrix} $$ for any $a_0, a_1$. Any linear recurrence relation can be solved using matrix exponentiation, e.g., $a_{n}=a_{n-1}-3a_{n-3}+5a_{n-5}-7$. See this blog post: Recurrence relation and matrix exponentiation.
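The general recipe can be sketched for the example recurrence above. One detail worth showing: the constant term $-7$ is handled by carrying a fixed $1$ as an extra component of the state vector. The base values $a_0,\dots,a_4$ are not specified in the answer, so they are parameters here.

```python
# Sketch: a_n = a_{n-1} - 3 a_{n-3} + 5 a_{n-5} - 7 by matrix exponentiation.
# State vector v_k = [a_k, a_{k-1}, a_{k-2}, a_{k-3}, a_{k-4}, 1]^T,
# so M v_k = v_{k+1} and v_n = M^(n-4) v_4 for n >= 4.

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_vec(X, v):
    return [sum(X[i][k] * v[k] for k in range(len(v))) for i in range(len(v))]

def mat_pow(M, p):
    n = len(M)
    R = [[int(i == j) for j in range(n)] for i in range(n)]  # identity
    while p > 0:
        if p & 1:
            R = mat_mul(R, M)
        M = mat_mul(M, M)
        p >>= 1
    return R

M = [
    [1, 0, -3, 0, 5, -7],  # a_{k+1} = a_k - 3 a_{k-2} + 5 a_{k-4} - 7
    [1, 0,  0, 0, 0,  0],  # shift: a_k stays available as a_{(k+1)-1}
    [0, 1,  0, 0, 0,  0],
    [0, 0,  1, 0, 0,  0],
    [0, 0,  0, 1, 0,  0],
    [0, 0,  0, 0, 0,  1],  # keeps the constant 1 in place
]

def a(n, base):
    """a(n) for base values base = [a_0, ..., a_4]."""
    if n < 5:
        return base[n]
    v4 = [base[4], base[3], base[2], base[1], base[0], 1]
    return mat_vec(mat_pow(M, n - 4), v4)[0]
```

The first row of $M$ encodes the recurrence, the next four rows shift the window, and the last row preserves the constant; the same layout works for any linear recurrence with constant coefficients.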