[Math] For what $h$ are the columns of a matrix linearly dependent?

linear algebra

I asked a similar question yesterday, but this one is a bit different in terms of the computations. It is from an exam I took an hour ago.

For what $h$ are the columns of this matrix linearly dependent? $$\begin{bmatrix} 1 & -3 & 4 \\ -4 & 7 & h\\ 2 & -6 & 8 \end{bmatrix}$$

My attempt (after row reducing, though not completely):
$$\begin{bmatrix} 1 & -3 & 4 & 0 \\ -4 & 7 & h & 0 \\ 2 & -6 & 8 & 0 \end{bmatrix} \sim \begin{bmatrix} 1 & -3 & 4 & 0 \\ 0 & -5 & h+16 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} $$

My guess was that the columns are linearly dependent if $h=-16$ or $h=-\frac{28}{3}$, and I just guessed the $-16$. Hints please.

Best Answer

You are asking: for what values of $h$ are the vectors $$\vec{v_1}=\left(\begin{array}{r}1\\-4\\2\end{array}\right),\quad \vec{v_2}=\left(\begin{array}{r}-3\\7\\-6\end{array}\right),\quad \vec{v_3}=\left(\begin{array}{r}4\\h\\8\end{array}\right)$$ linearly dependent?

You seem to be trying to do this by looking at the equation $$\alpha\vec{v_1}+\beta\vec{v_2}+\gamma\vec{v_3}=\left(\begin{array}{c}0\\0\\0\end{array}\right)$$ and trying to determine for what values of $h$ there is a nonzero solution. This leads to the matrix you have: $$\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ -4 & 7 & h & 0\\ 2 & -6 & 8 & 0 \end{array}\right).$$ Now, since the third equation is a multiple of the first, that equation does not matter: it provides no new information. That means that you have a homogeneous system of two equations in three unknowns. Those systems always have infinitely many solutions. In particular, no matter what $h$ is, the system has infinitely many solutions, and so must have a nontrivial solution. Thus, the vectors are always linearly dependent.
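The key claim above, that the coefficient matrix has rank $2$ (and hence a free variable, and hence a nontrivial solution) for *every* value of $h$, can be spot-checked computationally. Below is a small stdlib-Python sketch using exact rational arithmetic; the `rank` helper is ad hoc, written just for this check, not from any library:

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over exact rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0])):
        # find a pivot at or below row r in this column
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# For every sampled h the rank is 2 < 3, so the homogeneous system
# has a free variable and therefore a nontrivial solution.
for h in [-16, Fraction(-28, 3), 0, 5, 1000]:
    assert rank([[1, -3, 4], [-4, 7, h], [2, -6, 8]]) == 2
```

The sampled values include the two guesses from the question; the rank is $2$ at those values and everywhere else.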

To understand what is happening, note that all three vectors lie in the plane $z=2x$. Any two vectors on the plane that are not collinear will span the plane. Since $\vec{v_1}$ and $\vec{v_2}$ are not collinear, and both lie on the plane $z=2x$, any vector that lies on the plane $z=2x$ will be a linear combination of $\vec{v_1}$ and $\vec{v_2}$. Or, put another way, three vectors in a $2$-dimensional space (a plane through the origin) are always linearly dependent.

Here you have three vectors that satisfy $z=2x$; every other vector that satisfies that equation is a linear combination of $\vec{v_1}$ and $\vec{v_2}$: if $(a,b,2a)^t$ lies in the plane, then the system $$\alpha\left(\begin{array}{r}1\\-4\\2\end{array}\right) + \beta\left(\begin{array}{r}-3\\7\\-6\end{array}\right) = \left(\begin{array}{c}a\\b\\2a\end{array}\right)$$ has a solution, namely $\alpha = -\frac{7a+3b}{5}$, $\beta=-\frac{4a+b}{5}$ (obtained by Gaussian elimination). In particular, since $\vec{v_3}$ lies in the plane $z=2x$ no matter what $h$ is, we have $$\vec{v_3} = -\frac{28+3h}{5}\vec{v_1} - \frac{16+h}{5}\vec{v_2}.$$ Note that this makes sense no matter what $h$ is.
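The formulas $\alpha = -\frac{7a+3b}{5}$, $\beta=-\frac{4a+b}{5}$ can be verified with a few lines of stdlib Python (exact rationals; the helper name `combo` is my own, not from the answer):

```python
from fractions import Fraction

v1 = [1, -4, 2]
v2 = [-3, 7, -6]

def combo(a, b):
    """alpha*v1 + beta*v2 with the coefficients derived above."""
    alpha = Fraction(-(7*a + 3*b), 5)
    beta = Fraction(-(4*a + b), 5)
    return [alpha*x + beta*y for x, y in zip(v1, v2)]

# Any (a, b, 2a) in the plane z = 2x is reached:
for a, b in [(1, 0), (0, 1), (3, 7)]:
    assert combo(a, b) == [a, b, 2*a]

# In particular, a = 4, b = h recovers v3 = (4, h, 8) for any h:
for h in [-16, 0, 5, 1000]:
    assert combo(4, h) == [4, h, 8]
```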

This can be read off your row-reduced matrix: you got $$\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ 0 & -5 & h+16 & 0\\ 0 & 0 & 0 & 0 \end{array}\right).$$ Divide the second row by $-5$ to get $$\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0 & 0 \end{array}\right),$$ and now add three times the second row to the first row to get $$\left(\begin{array}{rrc|c} 1 & 0 & 4+\frac{-3h-48}{5} & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0 & 0 \end{array}\right) = \left(\begin{array}{rrc|c} 1 & 0 & -\frac{28+3h}{5} & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0& 0 \end{array}\right).$$ So $\alpha$ and $\beta$ are leading variables, and $\gamma$ is a free variable. This tells you that the solutions to the original system are: $$\begin{align*} \alpha &= \frac{28+3h}{5}t\\ \beta &= \frac{h+16}{5}t\\ \gamma &= t \end{align*}$$ Any nonzero value of $t$ gives you a nontrivial solution, and $t=-1$ gives you the solution I give above.
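The parametric family of solutions can be spot-checked the same way: plugging $\alpha=\frac{28+3h}{5}t$, $\beta=\frac{h+16}{5}t$, $\gamma=t$ into $\alpha\vec{v_1}+\beta\vec{v_2}+\gamma\vec{v_3}$ should give the zero vector for every $h$ and $t$. A stdlib-Python sketch (`is_solution` is an ad-hoc helper name):

```python
from fractions import Fraction

def is_solution(h, t):
    """Check that (alpha, beta, gamma) from the parametric family
    solves alpha*v1 + beta*v2 + gamma*v3 = 0."""
    alpha = Fraction((28 + 3*h) * t, 5)
    beta = Fraction((h + 16) * t, 5)
    gamma = Fraction(t)
    v3 = [4, h, 8]
    return all(alpha*x + beta*y + gamma*z == 0
               for x, y, z in zip([1, -4, 2], [-3, 7, -6], v3))

for h in [-16, 0, 9]:
    for t in [-1, 1, 5]:
        assert is_solution(h, t)
```

Taking $t=-1$ here reproduces exactly the dependence relation for $\vec{v_3}$ given earlier.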

Of course, this can be done much more simply by noting that since your original matrix has linearly dependent rows (the third row is a scalar multiple of the first row), the dimension of the row space is at most $2$ (in fact, exactly $2$), and hence the dimension of the column space is at most $2$ as well, since $\dim(\text{columnspace})=\dim(\text{rowspace})$; so the columns are always linearly dependent.
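A closely related criterion, not used in the answer above but equivalent for a square matrix, is that the columns are linearly dependent iff the determinant is $0$. Expanding along the first row, the $h$ terms cancel and the determinant is identically zero. A quick stdlib-Python check with an ad-hoc `det3` helper:

```python
from fractions import Fraction

def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

# The determinant vanishes for every sampled h, consistent with the
# rank being 2 regardless of h.
for h in [-16, Fraction(-28, 3), 0, 7, 10**6]:
    assert det3([[1, -3, 4], [-4, 7, h], [2, -6, 8]]) == 0
```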