[Math] Determinant of an almost-diagonal matrix

Tags: determinant, linear-algebra, matrices, self-learning

I would like to compute the determinant of the $(k+1)\times (k+1)$ matrix below

$$J=\begin{vmatrix}
y_{k+1} & 0 & \ldots & 0 & y_1 \\
0 & y_{k+1} & \ldots & 0 & y_2 \\
\vdots & \vdots & & \vdots & \vdots \\
0 & 0 & \ldots & y_{k+1} & y_k \\
-y_{k+1} & -y_{k+1} & \ldots & -y_{k+1} & \left(1-y_1-\ldots-y_k\right)
\end{vmatrix}$$
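
For concreteness, in the case $k=2$ this is the $3\times 3$ determinant

$$J=\begin{vmatrix}
y_{3} & 0 & y_1 \\
0 & y_{3} & y_2 \\
-y_{3} & -y_{3} & 1-y_1-y_2
\end{vmatrix}.$$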

Excluding the $(k+1)$th row and column, the matrix is diagonal, and this is where I get confused. I have tried the Laplace expansion along the first column and used elementary row operations to produce columns with just one non-zero entry, but that hasn't gotten me very far.

Is there maybe an easier way to get the determinant? Thank you in advance.

Best Answer

Add each of the first $k$ rows to the last row; this leaves the determinant unchanged. The last row then becomes $(0,\ldots,0,1)$, since $-y_{k+1}+y_{k+1}=0$ in each of the first $k$ columns and $\left(1-y_1-\ldots-y_k\right)+y_1+\ldots+y_k=1$ in the last column. The resulting matrix is upper triangular, so $\det J = y_{k+1}^k$.
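
Written out, after adding rows $1,\ldots,k$ to the last row the determinant reads

$$\det J=\begin{vmatrix}
y_{k+1} & 0 & \ldots & 0 & y_1 \\
0 & y_{k+1} & \ldots & 0 & y_2 \\
\vdots & \vdots & & \vdots & \vdots \\
0 & 0 & \ldots & y_{k+1} & y_k \\
0 & 0 & \ldots & 0 & 1
\end{vmatrix} = y_{k+1}^k \cdot 1 = y_{k+1}^k.$$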