[Math] Find Coordinate Vector with Respect to Given Basis

linear-algebra, matrices, proof-verification, vectors

I took Linear Algebra in the spring and I'm still having trouble with one concept so I figured I'd ask one of the test questions that I continue to struggle with. I would appreciate it if anyone could check my answer to part (a) and assist me in completing part (b).

The set $H$ of $2 \times 2$ matrices satisfying $A = A^t$ is a subspace of the set of all $2 \times 2$ matrices. There is a basis of $H$ given by $B = \{b_1, b_2, b_3\}$, where the $b_i$ are:

$$b_1 =\begin{bmatrix}1 & 0 \\0 & 0\end{bmatrix} \qquad b_2 =\begin{bmatrix}1 & 1 \\1 & 0\end{bmatrix} \qquad b_3 =\begin{bmatrix}1 & 1 \\1 & 1\end{bmatrix}$$

(a) With respect to this basis, a vector $v$ has coordinates $[v]_B = \begin{bmatrix}1\\-1\\3\end{bmatrix}$.
What is v?

My Work:

$$v = 1\begin{bmatrix}1 & 0\\0 & 0\end{bmatrix} - 1\begin{bmatrix}1 & 1\\1 & 0\end{bmatrix} + 3\begin{bmatrix}1 & 1\\1 & 1\end{bmatrix}$$

$$v = \begin{bmatrix}1 & 0\\0 & 0\end{bmatrix} + \begin{bmatrix}-1 & -1\\-1 & 0\end{bmatrix} + \begin{bmatrix}3 & 3\\3 & 3\end{bmatrix}$$

$$v = \begin{bmatrix}3 & 2\\2 & 3\end{bmatrix}$$

I'm not sure if my answer to part (a) is correct.
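As a quick numerical sanity check (a sketch using numpy, not part of the original question), the linear combination in part (a) can be computed directly:

```python
import numpy as np

# The basis matrices of H (symmetric 2x2 matrices) from the problem.
b1 = np.array([[1, 0], [0, 0]])
b2 = np.array([[1, 1], [1, 0]])
b3 = np.array([[1, 1], [1, 1]])

# Coordinates [v]_B = (1, -1, 3) from part (a).
v = 1 * b1 - 1 * b2 + 3 * b3
print(v)  # [[3 2]
          #  [2 3]]
```

This agrees with the matrix found above, so part (a) checks out.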

(b) Find $[v]_B$ if $v = \begin{bmatrix}-4 & 2 \\2 & 1\end{bmatrix}$.

My Work:

For this part I get very confused about how to set up the problem. I understand how to find the coordinate vector when the basis consists of $2 \times 1$ vectors, but when the basis elements have higher order I struggle. I would begin this problem by setting up a matrix equation of the form $Ax = b$:

$$ \begin{bmatrix}Basis\end{bmatrix} \begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}-4 & 2 \\2 & 1\end{bmatrix}$$

I'm unsure how a basis consisting of three $2 \times 2$ matrices can be multiplied by a $2 \times 1$ vector and yield a $2 \times 2$ matrix. It is very likely that I am approaching part (b) incorrectly.

EDIT: Solution

I was able to find the solution by setting up the following equation:

$$\begin{bmatrix}-4 & 2 \\2 & 1\end{bmatrix} = x_1\begin{bmatrix}1 & 0\\0 & 0\end{bmatrix} + x_2\begin{bmatrix}1 & 1\\1 & 0\end{bmatrix} + x_3\begin{bmatrix}1 & 1\\1 & 1\end{bmatrix}$$

$$[v]_B = \begin{bmatrix}-6 \\1 \\ 1\end{bmatrix}$$
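Comparing entries of $v = x_1b_1 + x_2b_2 + x_3b_3$ gives a small linear system, which can be solved numerically (a sketch with numpy, not part of the original post):

```python
import numpy as np

# Each distinct entry of v = x1*b1 + x2*b2 + x3*b3 gives one equation.
# Stack the three independent entries (upper-left, upper-right, lower-right)
# of the basis matrices b1, b2, b3 as columns:
A = np.array([[1, 1, 1],   # upper-left entries of b1, b2, b3
              [0, 1, 1],   # upper-right entries
              [0, 0, 1]])  # lower-right entries
rhs = np.array([-4, 2, 1])  # corresponding entries of v

x = np.linalg.solve(A, rhs)
print(x)  # [-6.  1.  1.]
```

The system is upper triangular, which is why back-substitution (as in the answer below) works so cleanly.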

Best Answer

Going back to basic definitions, as you did in your solution, is a good way to go when you get stuck. Because of the particular basis that you’ve been given, you could also have found the coordinates by working backwards through the basis. Essentially, you solve the equation that you’ve set up by back-substitution. Observe that the only element of the basis that has a non-zero lower-right entry is $b_3$. Its value is $1$, so the coefficient of $b_3$—the third coordinate—must be $1/1=1$. Subtract $b_3$ from $v$. Looking at the remaining basis vectors, we see that the upper-right and lower-left entries of $v-b_3$ had better be equal, otherwise $v$ isn’t in this subspace (in fact, those entries of $v$ need to be equal in the first place). These entries are zero in $b_1$, so the second coordinate is completely determined by those entries of $v-b_3$, i.e., it is $1/1=1$. Whatever’s left over in the upper-left of $v-b_2-b_3$ is then the first coordinate.
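The back-substitution described above can be sketched as code (a hypothetical helper written for illustration, not part of the answer):

```python
import numpy as np

def coords_by_backsub(v):
    """Back-substitute through the basis b3, b2, b1 of H."""
    b2 = np.array([[1, 1], [1, 0]])
    b3 = np.array([[1, 1], [1, 1]])
    assert v[0, 1] == v[1, 0], "v must be symmetric to lie in H"
    x3 = v[1, 1]       # only b3 has a nonzero lower-right entry
    r = v - x3 * b3
    x2 = r[0, 1]       # among b1, b2, only b2 has a nonzero upper-right entry
    r = r - x2 * b2
    x1 = r[0, 0]       # whatever is left in the upper-left
    return np.array([x1, x2, x3])

print(coords_by_backsub(np.array([[-4, 2], [2, 1]])))  # [-6  1  1]
```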

Now, you could also have solved this with a matrix equation of the same form as in your first attempt, but to do this you have to “flatten” the matrices via an isomorphism along the lines of $$\phi:\begin{bmatrix}a&b\\c&d\end{bmatrix}\mapsto\begin{bmatrix}a\\b\\c\\d\end{bmatrix}.$$ By linearity, if $v=x_1b_1+x_2b_2+x_3b_3$, then $\phi(v)=x_1\phi(b_1)+x_2\phi(b_2)+x_3\phi(b_3)$, so the coordinates are the same either way. The equation to solve would then be $$\begin{bmatrix}1&1&1\\0&1&1\\0&1&1\\0&0&1\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix}=\begin{bmatrix}-4\\2\\2\\1\end{bmatrix}.$$ The second and third equations of the system that this matrix equation represents are clearly redundant, which is what we’d expect since the entries of the corresponding matrix elements are identical in each basis matrix.
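The flattened $4\times 3$ system is overdetermined but consistent, so a least-squares solver recovers the exact coordinates (a numerical sketch, assuming the flattening $\phi$ above):

```python
import numpy as np

# Columns are phi(b1), phi(b2), phi(b3): each basis matrix flattened row-wise.
A = np.array([[1, 1, 1],
              [0, 1, 1],
              [0, 1, 1],
              [0, 0, 1]])
rhs = np.array([-4, 2, 2, 1])  # phi(v) for v = [[-4, 2], [2, 1]]

# lstsq handles the non-square (but full-column-rank, consistent) system.
x, residuals, rank, _ = np.linalg.lstsq(A, rhs, rcond=None)
print(np.round(x))  # [-6.  1.  1.]
```

The rank of $A$ is 3 despite its four rows, reflecting the redundant second and third equations noted above.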
