[Math] In each part, determine whether the matrices are linearly independent or dependent.



I'm not quite sure how to do this, as the book had no example of linear dependence for matrices specifically.

I tried to see whether the example in this post would work: Determining Linear Dependence/Independence of vectors in R2x2?

but I don't know how he added up the entries to get $c_1$, $c_2$, $c_3$, and then $c_1$ again; that part lost me, if that is even the right method. How do I solve this?

Best Answer

You're looking for $\alpha, \beta, \gamma \in \mathbb{R}$, with $\alpha \neq 0$ or $\beta \neq 0$ or $\gamma \neq 0$, such that:

$\displaystyle \alpha\begin{bmatrix}1 & 0 \\ 1 & 2\end{bmatrix}+\beta\begin{bmatrix}1 & 2 \\ 2 & 1\end{bmatrix}+\gamma\begin{bmatrix}0 & 1 \\ 2 & 1\end{bmatrix}=\begin{bmatrix}0 & 0 \\ 0 & 0\end{bmatrix}$

If it's possible to find such $\alpha, \beta, \gamma$, the matrices are linearly dependent; if not, they are linearly independent.

In this case, first look at the entry in the first row and first column: it forces $\alpha = -\beta$, so:

$\displaystyle \alpha\begin{bmatrix}1 & 0 \\ 1 & 2\end{bmatrix}-\alpha\begin{bmatrix}1 & 2 \\ 2 & 1\end{bmatrix}+\gamma\begin{bmatrix}0 & 1 \\ 2 & 1\end{bmatrix}=\begin{bmatrix}0 & 0 \\ 0 & 0\end{bmatrix}$

$\displaystyle \alpha\begin{bmatrix}0 & -2 \\ -1 & 1\end{bmatrix}+\gamma\begin{bmatrix}0 & 1 \\ 2 & 1\end{bmatrix}=\begin{bmatrix}0 & 0 \\ 0 & 0\end{bmatrix}$

Now look at the entry in the second row and second column: you must have $\alpha = -\gamma$, and substituting $\gamma = -\alpha$ gives:

$\displaystyle \alpha\begin{bmatrix}0 & -3 \\ -3 & 0\end{bmatrix}=\begin{bmatrix}0 & 0 \\ 0 & 0\end{bmatrix}$

The off-diagonal entries force $-3\alpha = 0$, so $\alpha = 0$, and then $\beta = -\alpha = 0$ and $\gamma = -\alpha = 0$. Only the trivial combination works, so the matrices are linearly independent.
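If you want to double-check this kind of computation numerically, one way (a sketch using numpy, not part of the original answer) is to flatten each $2\times 2$ matrix into a vector in $\mathbb{R}^4$ and test whether those vectors are linearly independent via the rank:

```python
import numpy as np

# Flatten each 2x2 matrix into a length-4 vector; use them as columns of A.
A = np.column_stack([
    np.array([[1, 0], [1, 2]]).ravel(),
    np.array([[1, 2], [2, 1]]).ravel(),
    np.array([[0, 1], [2, 1]]).ravel(),
])

# The matrices are linearly independent iff the only solution of
# A @ [alpha, beta, gamma] = 0 is the trivial one, i.e. rank(A) == 3.
print(np.linalg.matrix_rank(A))  # prints 3: linearly independent
```

A rank of 3 means the homogeneous system has only the trivial solution, matching the hand computation above.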