Determinant with entries $a_1,\dots,a_n$ in each row and $-x$ on the diagonal

determinant, linear algebra

$\left|\begin{matrix}
-x&a_2&\cdots&a_{n}\\
a_1&-x&\cdots&a_{n}\\
a_1&a_2&\cdots&a_{n}\\
\vdots&\vdots&\ddots&\vdots\\
a_1&a_2&\cdots&-x
\end{matrix}\right|$

So I tried to find an expression $D_{n}=kD_{n-1}+tD_{n-2}$
by writing the last diagonal entry as $a_n+(-x-a_n)$: $\left|\begin{matrix}
-x&a_2&\cdots&a_{n}\\
a_1&-x&\cdots&a_{n}\\
a_1&a_2&\cdots&a_{n}\\
\vdots&\vdots&\ddots&\vdots\\
a_1&a_2&\cdots&a_n+(-x-a_n)
\end{matrix}\right|$
so we can split it along the last row and get
$\left|\begin{matrix}
-x&a_2&\cdots&a_{n}\\
a_1&-x&\cdots&a_{n}\\
a_1&a_2&\cdots&a_{n}\\
\vdots&\vdots&\ddots&\vdots\\
a_1&a_2&\cdots&a_n
\end{matrix}\right|$
which, after subtracting the last row from each of the other rows, becomes a lower triangular matrix, hence its determinant is $a_n\prod_{1\le i\le n-1}(-x-a_i)$.
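For example, for $n=3$ this first piece reduces to
$$\left|\begin{matrix} -x&a_2&a_3\\ a_1&-x&a_3\\ a_1&a_2&a_3 \end{matrix}\right|=\left|\begin{matrix} -x-a_1&0&0\\ 0&-x-a_2&0\\ a_1&a_2&a_3 \end{matrix}\right|=a_3(-x-a_1)(-x-a_2).$$
The other piece of the split is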

$\left|\begin{matrix}
-x&a_2&\cdots&0\\
a_1&-x&\cdots&0\\
a_1&a_2&\cdots&0\\
\vdots&\vdots&\ddots&\vdots\\
a_1&a_2&\cdots&-x-a_n
\end{matrix}\right|$
Now I expand it along the last column and get $(-x-a_n)D_{n-1}$, but the resulting expression gets very complicated. Can I please get a hint on a good way to solve this? I didn't see a better way by manipulating rows/columns (e.g. adding/subtracting all the columns to the first: since the $i$-th row is missing $a_i$, I can't simplify the determinant that way).
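(For reference, the two pieces together give
$$D_n=(-x-a_n)D_{n-1}+a_n\prod_{1\le i\le n-1}(-x-a_i),$$
where $D_{n-1}$ is the same kind of determinant built from $a_1,\dots,a_{n-1}$, and I don't see how to unwind this nicely.)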

Best Answer

Subtract the last row from each preceding row to obtain

$$D_n(a_1, \ldots, a_n;x) = \left|\begin{matrix} -x&a_2&\cdots&a_{n}\\ a_1&-x&\cdots&a_{n}\\ \vdots&\vdots&\ddots&\vdots\\ a_1&a_2&\cdots&-x \end{matrix}\right| = \left|\begin{matrix} -x-a_1&0&\cdots&0&a_{n}+x\\ 0&-x-a_2&\cdots&0&a_{n}+x\\ \vdots&\vdots&\ddots&\vdots&\vdots\\ 0&0&\cdots&-x-a_{n-1}&a_n+x\\ a_1&a_2&\cdots&a_{n-1}&-x \end{matrix}\right|$$
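For instance, for $n=3$ this step reads
$$\left|\begin{matrix} -x&a_2&a_3\\ a_1&-x&a_3\\ a_1&a_2&-x \end{matrix}\right|=\left|\begin{matrix} -x-a_1&0&a_3+x\\ 0&-x-a_2&a_3+x\\ a_1&a_2&-x \end{matrix}\right|.$$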

Now expand along the first column and iterate the procedure: $$= (-x-a_1)\left|\begin{matrix} -x-a_2&\cdots&0&a_{n}+x\\ \vdots&\ddots&\vdots&\vdots\\ 0&\cdots&-x-a_{n-1}&a_n+x\\ a_2&\cdots&a_{n-1}&-x \end{matrix}\right|+(-1)^{n+1}a_1\left|\begin{matrix} 0&\cdots&0&a_{n}+x\\ -x-a_2&\cdots&0&a_{n}+x\\ \vdots&\ddots&\vdots&\vdots\\ 0&\cdots&-x-a_{n-1}&a_n+x \end{matrix}\right|$$ The first minor is the transformed form of $D_{n-1}(a_2, \ldots, a_n;x)$, and expanding the second minor along its first row gives $(-1)^{n}(a_n+x)(-x-a_2)\cdots(-x-a_{n-1})$, so \begin{align} &=(-x-a_1)D_{n-1}(a_2, \ldots, a_n;x)+a_1(-x-a_2)\cdots(-x-a_n)\\ &=(-x-a_1)(-x-a_2)D_{n-2}(a_3, \ldots, a_n;x)+a_1(-x-a_2)\cdots(-x-a_n)+a_2(-x-a_1)(-x-a_3)\cdots (-x-a_n)\\ &=\cdots\\ &=(-x-a_1)\cdots(-x-a_n)+\sum_{i=1}^n a_i(-x-a_1)\cdots(-x-a_{i-1})(-x-a_{i+1})\cdots (-x-a_n) \end{align}
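As a quick sanity check, for $n=2$ this gives $$(-x-a_1)(-x-a_2)+a_1(-x-a_2)+a_2(-x-a_1)=x^2-a_1a_2,$$ which matches the direct evaluation of the $2\times 2$ determinant.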

This closed form is the same as @user1551's result.
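If you want to double-check the closed form numerically, here is a minimal sketch (assuming NumPy; the names are only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
a = rng.standard_normal(n)   # a_1, ..., a_n
x = rng.standard_normal()

# Build the matrix: entry (i, j) is a_j off the diagonal, -x on the diagonal.
M = np.tile(a, (n, 1))
np.fill_diagonal(M, -x)

# Closed form: prod_i (-x - a_i) + sum_i a_i * prod_{j != i} (-x - a_j).
closed_form = np.prod(-x - a) + sum(
    a[i] * np.prod(np.delete(-x - a, i)) for i in range(n)
)

print(np.linalg.det(M), closed_form)  # the two values should agree
```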
