[Math] How to justify this without determining the determinant

determinant, linear algebra

I need to justify that the following equation is true:

$$
\begin{vmatrix}
a_1+b_1x & a_1x+b_1 & c_1 \\
a_2+b_2x & a_2x+b_2 & c_2 \\
a_3+b_3x & a_3x+b_3 & c_3 \\
\end{vmatrix} = (1-x^2)\cdot\begin{vmatrix}
a_1 & b_1 & c_1 \\
a_2 & b_2 & c_2 \\
a_3 & b_3 & c_3 \\
\end{vmatrix}
$$

I tried splitting the determinant of the first matrix into a sum of two determinants, so that the first would have no $b$'s and the second no $a$'s.
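
In symbols, the split I had in mind was

$$
\begin{vmatrix}
a_1+b_1x & a_1x+b_1 & c_1 \\
a_2+b_2x & a_2x+b_2 & c_2 \\
a_3+b_3x & a_3x+b_3 & c_3 \\
\end{vmatrix}
\overset{?}{=}
\begin{vmatrix}
a_1 & a_1x & c_1 \\
a_2 & a_2x & c_2 \\
a_3 & a_3x & c_3 \\
\end{vmatrix}
+
\begin{vmatrix}
b_1x & b_1 & c_1 \\
b_2x & b_2 & c_2 \\
b_3x & b_3 & c_3 \\
\end{vmatrix}.
$$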

Then I'd multiply the second column of the first matrix and the first column of the second by $\frac 1x$, so I'd have $x^2$ times the sum of the determinants of the two resulting matrices.

I could then subtract column 1 from column 2 in both matrices, which would give a column of zeros in each, hence both determinants are zero, and multiplying by $x^2$ still gives zero, so I didn't prove anything. What did I do wrong?

Best Answer

For another solution, note that

$$
\underbrace{\begin{bmatrix} a_1+b_1x & a_1x+b_1 & c_1 \\ a_2+b_2x & a_2x+b_2 & c_2 \\ a_3+b_3x & a_3x+b_3 & c_3 \\ \end{bmatrix}}_{A} = \underbrace{\begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \\ \end{bmatrix}}_{B} \underbrace{\begin{bmatrix} 1 & x & 0 \\ x & 1 & 0 \\ 0 & 0 & 1 \\ \end{bmatrix}}_{C}
$$

and therefore $\det(A) = \det(BC) = \det(B)\det(C)$. From there, it's enough to check that

$$
\det(C) = \begin{vmatrix} 1 & x & 0 \\ x & 1 & 0 \\ 0 & 0 & 1 \\ \end{vmatrix} = \begin{vmatrix} 1 & x \\ x & 1 \end{vmatrix} = 1 \cdot 1 - x \cdot x = 1-x^2.
$$
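
For completeness, the factorization itself can be checked row by row: multiplying the first row of $B$ by $C$ gives

$$
\begin{bmatrix} a_1 & b_1 & c_1 \end{bmatrix}
\begin{bmatrix} 1 & x & 0 \\ x & 1 & 0 \\ 0 & 0 & 1 \\ \end{bmatrix}
= \begin{bmatrix} a_1 + b_1 x & a_1 x + b_1 & c_1 \end{bmatrix},
$$

which is the first row of $A$, and the other two rows work out the same way.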
