How can one compute the determinant of a tridiagonal matrix when using integers?

determinant · matrices · tridiagonal-matrices

My question is similar to How to compute the determinant of a tridiagonal matrix with constant diagonals?
However, all of the options seem to involve computations that are not permitted for integers. I am trying to find the determinant of a matrix over the integers; more specifically, this matrix:

$X = \begin{bmatrix}
2 & -1 & 0 & 0 & \cdots & 0 & 0 \\
-1 & 2 & -1 & 0 & \ddots & 0 & 0 \\
0 & -1 & 2 & -1 & \ddots & 0 & 0 \\
\vdots & \ddots & \ddots & \ddots & \ddots & \vdots & \vdots \\
0 & 0 & 0 & \cdots & -1 & 2 & -1 \\
0 & 0 & 0 & \cdots & 0 & -1 & 2 \\
\end{bmatrix} \in \mathbb{Z}^{n,n}$

Computing the determinant for small $n$ (e.g. with the Rule of Sarrus for $n = 3$) gave me the suspicion that it is always $n + 1$. However, how do I show this for larger $n$ as well?
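(The small-$n$ observation can be checked mechanically. As a sketch, here is a brute-force integer-only check in Python; `tridiag` and `det` are hypothetical helper names, and the cofactor expansion is exponential, so it is only suitable for small $n$.)

```python
def tridiag(n):
    """Build the n x n integer matrix with 2 on the diagonal and -1 beside it."""
    return [[2 if i == j else (-1 if abs(i - j) == 1 else 0)
             for j in range(n)] for i in range(n)]

def det(m):
    """Integer determinant by Laplace expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

for n in range(1, 8):
    print(n, det(tridiag(n)))  # prints n and n + 1 for each n
```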

Best Answer

While I don't quite understand why you only want to use integers, you can prove this by induction, using nothing but integer arithmetic. Let $d_n$ be the determinant of your matrix of size $n \times n$; then $d_1 = 2$ and $d_2 = 3$. Assume that $d_k = k+1$ for all $k < n$. Laplace expansion along the first row gives $d_n = 2d_{n-1} - d_{n-2}$: the first cofactor is the same matrix of size $n-1$, and the second cofactor, expanded along its first column, reduces to the matrix of size $n-2$. By the induction hypothesis, $$d_n = 2d_{n-1} - d_{n-2} = 2n - (n-1) = n+1,$$ and we are done.
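(The recurrence itself gives an $O(n)$ integer-only way to compute the determinant, with no division at all. A minimal sketch; the function name `d` is just an illustration.)

```python
def d(n):
    """Determinant of the n x n tridiagonal (2, -1) matrix via the
    integer recurrence d_k = 2*d_{k-1} - d_{k-2}, d_1 = 2, d_2 = 3."""
    a, b = 2, 3  # d_1, d_2
    if n == 1:
        return a
    for _ in range(n - 2):
        a, b = b, 2 * b - a
    return b

print([d(n) for n in range(1, 8)])  # [2, 3, 4, 5, 6, 7, 8]
```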
