Express a vector as a linear combination of two other vectors.

linear algebra

Let $A = \begin{bmatrix}
1 & -1 & 1 \\
1 & 0 & 2 \\
-1 & 2 & 0
\end{bmatrix}$
and $B = \begin{bmatrix}
1 & 0 & 2 \\
0 & 1 & 1 \\
0 & 0 & 0
\end{bmatrix}$
. $A$ row reduces to $B$; you are not asked to verify this.
1. Find a basis for the column space $\text{col}(A)$.
From $B$, it is clear that the first and second columns are the pivot columns. Thus, the corresponding columns of $A$ form a basis for $\text{col}(A)$, i.e.,
\begin{equation*}
\left\{\begin{pmatrix}
1 \\
1 \\
-1
\end{pmatrix}, \begin{pmatrix}
-1 \\
0 \\
2
\end{pmatrix}\right\}.
\end{equation*}
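As a quick numerical sanity check (not part of the original solution), one can verify with NumPy that $A$ has rank $2$ and that, as $B$ indicates, the third column of $A$ satisfies $a_3 = 2a_1 + a_2$, so the first two columns indeed span $\text{col}(A)$:

```python
import numpy as np

# The matrix A from the problem statement.
A = np.array([[1, -1, 1],
              [1,  0, 2],
              [-1, 2, 0]], dtype=float)

# Rank 2 matches the two pivot columns of B.
assert np.linalg.matrix_rank(A) == 2

# The first two columns are linearly independent ...
assert np.linalg.matrix_rank(A[:, :2]) == 2

# ... and the third column lies in their span: from B, a3 = 2*a1 + a2.
assert np.allclose(2 * A[:, 0] + A[:, 1], A[:, 2])
```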

2. Determine the vector of $\text{col}(A)$ which is nearest to the vector $\begin{bmatrix}
2 \\
-2 \\
0
\end{bmatrix}$
.
The basis vectors of $\text{col}(A)$ found above are linearly independent, so collect them as the columns of the matrix $C = \begin{bmatrix} 1 & -1 \\ 1 & 0 \\ -1 & 2 \end{bmatrix}$. The projection matrix onto $\text{col}(A)$ is then $P = C(C^TC)^{-1}C^T$. Let's first calculate $C^TC$:
\begin{equation*}
C^TC = \begin{bmatrix}
1 & 1 & -1 \\
-1 & 0 & 2
\end{bmatrix}
\begin{bmatrix}
1 & -1 \\
1 & 0 \\
-1 & 2
\end{bmatrix} = \begin{bmatrix}
3 & -3 \\
-3 & 5
\end{bmatrix}.
\end{equation*}

This means that
\begin{equation*}
(C^TC)^{-1} = \frac{1}{6}\begin{bmatrix}
5 & 3 \\
3 & 3
\end{bmatrix}.
\end{equation*}
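Here $\det(C^TC) = 3 \cdot 5 - (-3)(-3) = 6$, and the $2\times 2$ inverse swaps the diagonal entries and negates the off-diagonal ones. A quick NumPy check of this inverse (an addition for verification, not part of the original solution):

```python
import numpy as np

# C^T C computed above.
CtC = np.array([[3, -3],
                [-3, 5]], dtype=float)
inv = np.linalg.inv(CtC)

# det = 6, so the inverse is (1/6) * [[5, 3], [3, 3]].
assert np.allclose(inv, np.array([[5, 3], [3, 3]]) / 6)
```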

Hence,
\begin{equation*}
\begin{split}
P = C(C^TC)^{-1}C^T &= \frac{1}{6}\begin{bmatrix}
1 & -1 \\
1 & 0 \\
-1 & 2
\end{bmatrix}
\begin{bmatrix}
5 & 3 \\
3 & 3
\end{bmatrix}
\begin{bmatrix}
1 & 1 & -1 \\
-1 & 0 & 2
\end{bmatrix} \\
&= \frac{1}{6}\begin{bmatrix}
1 & -1 \\
1 & 0 \\
-1 & 2
\end{bmatrix}
\begin{bmatrix}
2 & 5 & 1 \\
0 & 3 & 3
\end{bmatrix} \\
&= \frac{1}{6}\begin{bmatrix}
2 & 2 & -2 \\
2 & 5 & 1 \\
-2 & 1 & 5
\end{bmatrix}.
\end{split}
\end{equation*}

So the projection of $(2,-2,0)$ onto $\text{col}(A)$ is
\begin{equation*}
\frac{1}{6}\begin{bmatrix}
2 & 2 & -2 \\
2 & 5 & 1 \\
-2 & 1 & 5
\end{bmatrix}
\begin{bmatrix}
2 \\
-2 \\
0
\end{bmatrix} = \begin{bmatrix}
0 \\
-1 \\
-1
\end{bmatrix}.
\end{equation*}
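The whole computation can be checked numerically. The sketch below (an addition, assuming NumPy) rebuilds $P$ from $C$, confirms the projection $(0,-1,-1)$, and cross-checks it against a least-squares solve, since the projection of $b$ onto $\text{col}(C)$ is $C\hat{x}$ where $\hat{x}$ minimizes $\|Cx - b\|$:

```python
import numpy as np

# Basis matrix C whose columns span col(A).
C = np.array([[1, -1],
              [1,  0],
              [-1, 2]], dtype=float)

# Projection matrix P = C (C^T C)^{-1} C^T.
P = C @ np.linalg.inv(C.T @ C) @ C.T
assert np.allclose(P, np.array([[2, 2, -2],
                                [2, 5,  1],
                                [-2, 1, 5]]) / 6)

b = np.array([2, -2, 0], dtype=float)
proj = P @ b
assert np.allclose(proj, [0, -1, -1])

# Same result via least squares: proj = C @ x, with x minimizing ||C x - b||.
x, *_ = np.linalg.lstsq(C, b, rcond=None)
assert np.allclose(C @ x, proj)
```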

3. Express the vector $\begin{bmatrix}
2 \\
-2 \\
0
\end{bmatrix}$
in $\mathbb{R}^3$ as a linear combination of two vectors, one in $\text{col}(A)$ and the other in $\text{col}(A)^{\perp}$.
Not sure how to do this one; any help would be great!

Best Answer

Just subtract $\begin{bmatrix} 0 \\ -1 \\ -1 \end{bmatrix}$ from $\begin{bmatrix} 2 \\ -2 \\ 0 \end{bmatrix}$. The difference, $\begin{bmatrix} 2 \\ -1 \\ 1 \end{bmatrix}$, must lie in $\text{col}(A)^{\perp}$ by properties of orthogonal projections, which gives the decomposition $\begin{bmatrix} 2 \\ -2 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ -1 \\ -1 \end{bmatrix} + \begin{bmatrix} 2 \\ -1 \\ 1 \end{bmatrix}$. Indeed, you can check perpendicularity by dotting $(2,-1,1)$ with each basis vector of $\text{col}(A)$: both dot products are $0$.
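This orthogonal decomposition is easy to verify numerically; a minimal sketch (assuming NumPy, not part of the original answer):

```python
import numpy as np

b = np.array([2, -2, 0], dtype=float)
p = np.array([0, -1, -1], dtype=float)  # projection onto col(A)
q = b - p                               # should lie in col(A)^perp

# q is orthogonal to both basis vectors of col(A) ...
assert np.isclose(q @ np.array([1, 1, -1]), 0)
assert np.isclose(q @ np.array([-1, 0, 2]), 0)

# ... and p + q reconstructs the original vector.
assert np.allclose(p + q, b)
```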