A basis for a vector space is by definition a spanning set which is linearly independent.
Here the vector space is 2x2 matrices, and we are asked to show that a collection of four specific matrices is a basis:
$$ \left\{\begin{bmatrix}0&1\\2&3\end{bmatrix}, \begin{bmatrix}3&4\\5&6\end{bmatrix},
\begin{bmatrix}7&8\\9&10\end{bmatrix}, \begin{bmatrix}11&12\\13&14\end{bmatrix} \right\} $$
To be a spanning set means every element of the vector space can be expressed as a linear combination of finitely many elements of the given set. Here it means showing that for any 2x2 matrix $\begin{bmatrix}w&x\\y&z\end{bmatrix}$ there exist scalar coefficients $a,b,c,d$ such that:
$$ \begin{bmatrix}w&x\\y&z\end{bmatrix} = a \begin{bmatrix}0&1\\2&3\end{bmatrix} + b \begin{bmatrix}3&4\\5&6\end{bmatrix} + c \begin{bmatrix}7&8\\9&10\end{bmatrix} + d \begin{bmatrix}11&12\\13&14\end{bmatrix} $$
To be linearly independent means that the only way to express zero as a linear combination of finitely many elements of the given set is with all coefficients zero, i.e. in the present case that:
$$ \begin{bmatrix}0&0\\0&0\end{bmatrix} = a \begin{bmatrix}0&1\\2&3\end{bmatrix} + b \begin{bmatrix}3&4\\5&6\end{bmatrix} + c \begin{bmatrix}7&8\\9&10\end{bmatrix} + d \begin{bmatrix}11&12\\13&14\end{bmatrix} $$
implies $a=b=c=d=0$.
It turns out, as Nicholas R. Peterson has already reported, that neither of these conditions is true. The four given vectors do not form a basis for the vector space of 2x2 matrices. (Some other sets of four vectors will form such a basis, but not these.)
Let's take the opportunity to explain a good way to set up the calculations, without immediately jumping to the conclusion of failure to be a basis. The spanning set and linearly independent properties are easily combined into a method for checking if a finite set is a basis, by formulating a system of linear equations with unknown coefficients. If the systems above (which differ only in the "given" matrix on the left hand side) are multiplied out to determine the four separate entries of the results on the right hand side, we see that the first condition (spanning) requires us to solve:
$$ 0a + 3b + 7c + 11d = w $$
$$ 1a + 4b + 8c + 12d = x $$
$$ 2a + 5b + 9c + 13d = y $$
$$ 3a + 6b + 10c + 14d = z $$
for any possible $w,x,y,z$, while the second condition (linear independence) requires us to show the homogeneous problem $w=x=y=z=0$ has only the trivial solution $a=b=c=d=0$. Therefore treating the above linear system will tell us everything we want to know about whether the four vectors form a basis.
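Both conditions can be checked at once numerically; here is a minimal sketch (using numpy), with the four given matrices flattened column-wise into the coefficient matrix of the system above:

```python
import numpy as np

# Columns of M are the four given matrices, flattened in the order
# (top-left, top-right, bottom-left, bottom-right), matching the system above.
M = np.array([
    [0, 3, 7, 11],
    [1, 4, 8, 12],
    [2, 5, 9, 13],
    [3, 6, 10, 14],
])

rank = np.linalg.matrix_rank(M)
print(rank)  # 2: less than 4, so the matrices neither span nor are independent
```

A rank of 4 would be equivalent to both conditions holding at once; a rank of 2 already shows that both must fail.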
Presumably the Reader is familiar with solving such systems by elimination, either using the simple expedient of substitution of variables or the more systematic method of putting an augmented matrix representing the system into reduced row echelon form. We take a middle path, of leaving the system in an equational form but performing the addition/subtraction/scaling of equations just as someone using elementary row operations might proceed.
To begin, subtract the top equation from each of the three equations below it:
$$ 0a + 3b + 7c + 11d = \;w\; $$
$$ 1a + 1b + 1c + 1d = x-w $$
$$ 2a + 2b + 2c + 2d = y-w $$
$$ 3a + 3b + 3c + 3d = z-w $$
Now divide the last two equations by 2 and 3, respectively:
$$ 0a + 3b + 7c + 11d = \;w\; $$
$$ 1a + 1b + 1c + 1d = x-w $$
$$ 1a + 1b + 1c + 1d = \frac{y-w}{2} $$
$$ 1a + 1b + 1c + 1d = \frac{z-w}{3} $$
We begin to see that these equations cannot always be solved, because the last three left hand sides will always be equal, but the values of $w,x,y,z$ may not give the last three right hand sides as equal! Let's take the next step and subtract the second equation from the last two:
$$ 0a + 3b + 7c + 11d = \;w\; $$
$$ 1a + 1b + 1c + 1d = x-w $$
$$ 0a + 0b + 0c + 0d = \frac{y-w}{2} - (x-w) $$
$$ 0a + 0b + 0c + 0d = \frac{z-w}{3} - (x-w) $$
Can we always find a solution to these equations? No, because we can pick values for $w,x,y,z$ that make the right hand sides in either of the last two equations nonzero while any choice of $a,b,c,d$ will only give a left hand side there of zero. For example we might take $x=1$ and $w=y=z=0$, so the right hand sides of the last two equations are $-1$. The system is then inconsistent, since $0 \neq -1$.
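As a numerical cross-check (a sketch using numpy; the right-hand side $x=1$, $w=y=z=0$ is the one chosen above), comparing the rank of the coefficient matrix with that of the augmented matrix confirms the inconsistency:

```python
import numpy as np

M = np.array([[0, 3, 7, 11],
              [1, 4, 8, 12],
              [2, 5, 9, 13],
              [3, 6, 10, 14]])
rhs = np.array([0, 1, 0, 0])  # w=0, x=1, y=0, z=0

# By the Rouche-Capelli theorem, the system is solvable iff appending
# the right-hand side does not increase the rank.
aug = np.column_stack([M, rhs])
print(np.linalg.matrix_rank(M), np.linalg.matrix_rank(aug))  # 2 3, so inconsistent
```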
With a little thought we can also see that the four vectors (matrices) are not linearly independent. That is, some choice of $a,b,c,d$ other than simply $a=b=c=d=0$ will give all zeros on the right hand sides (since now $w=x=y=z=0$ is stipulated):
$$ 0a + 3b + 7c + 11d = 0 $$
$$ 1a + 1b + 1c + 1d = 0 $$
$$ 0a + 0b + 0c + 0d = 0 $$
$$ 0a + 0b + 0c + 0d = 0 $$
We really only have two equations (the first two of our last system) to determine four unknowns, and the Reader should be familiar with the fact that a homogeneous linear system with fewer equations than unknowns always has nontrivial solutions. If a particular nontrivial choice of $a,b,c,d$ is desired, let $c=1$ and $d=0$ in those two equations, and work out that $b=-\frac{7}{3}$ and $a=\frac{4}{3}$. So $a=b=c=d=0$ is not the only way to express the zero 2x2 matrix as a linear combination of the four vectors.
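The particular nontrivial combination just found is easy to verify numerically (a sketch using numpy):

```python
import numpy as np

mats = [np.array([[0, 1], [2, 3]]),
        np.array([[3, 4], [5, 6]]),
        np.array([[7, 8], [9, 10]]),
        np.array([[11, 12], [13, 14]])]
coeffs = [4/3, -7/3, 1, 0]  # a, b, c, d as worked out in the text

# The combination should give the zero 2x2 matrix (up to floating-point error).
combo = sum(k * m for k, m in zip(coeffs, mats))
print(np.allclose(combo, 0))  # True
```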
Let me discuss as much as I have understood. Please correct me if there is anything wrong.
Taking $A\in S$ where $A=\begin{bmatrix}a & b\\c & d\\\end{bmatrix}$, the condition $AA^T=A^TA$ gives (on some direct calculation through the matrix multiplication) $b=\pm c$ and $ac+bd=ab+cd$. If $b=c$ then it is fine (i.e. $A$ is symmetric). In case $b=-c$ (with both nonzero; otherwise $b=c=0$, which is covered already), substituting into the second condition gives $c(a-d)=c(d-a)$, which is possible only when $a=d$.
So we conclude: if $A\in S$ then either $A$ is symmetric or $A$ has the form $\begin{bmatrix}a & b\\-b & a\\\end{bmatrix}$.
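This classification is easy to spot-check numerically; below is a sketch (using numpy, with sample entries that are my own illustrative choices):

```python
import numpy as np

def is_normal(A):
    """Check the defining condition of S: A A^T = A^T A."""
    return np.allclose(A @ A.T, A.T @ A)

A = np.array([[2.0, 5.0], [-5.0, 2.0]])  # the form [[a, b], [-b, a]]
B = np.array([[1.0, 3.0], [3.0, 4.0]])   # symmetric
C = np.array([[1.0, 2.0], [3.0, 4.0]])   # neither form
print(is_normal(A), is_normal(B), is_normal(C))  # True True False
```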
Now if we take $A,B\in S$ such that only one of them (say $A$) is symmetric, then clearly we can take $A=\begin{bmatrix}a & b\\b & d\\\end{bmatrix}$ and $B=\begin{bmatrix}e & f\\-f & e\\\end{bmatrix}$, and conclude that $A+B=\begin{bmatrix}a+e & b+f\\b-f & d+e\\\end{bmatrix}$ is in general not in $S$, since $A+B$ is in general neither symmetric nor of the specified form; $A+B$ can be in $S$ only when $a=d$ and $b=0$.
So my conclusion is that $S$ cannot be a subspace in general.
Now if $S$ is considered to be a subset of symmetric matrices then as OP proved, it is indeed a subspace.
(Please note that to prove scalar multiples remain in $S$ we do not need symmetry; $AA^T=A^TA$ alone is sufficient, since $(kA)(kA)^T = k^2\,AA^T = k^2\,A^TA = (kA)^T(kA)$.)
To extend the discussion: if we consider $S$ such that every element of $S$ has the specified form, then taking $A=\begin{bmatrix}a & b\\-b & a\\\end{bmatrix}$ and $B=\begin{bmatrix}c & d\\-d & c\\\end{bmatrix}$ we see that $A+B=\begin{bmatrix}x & y\\-y & x\\\end{bmatrix}$ where $x=a+c$, $y=b+d$, and then clearly $(A+B)^T(A+B)=(A+B)(A+B)^T=\begin{bmatrix}x^2+y^2 & 0\\0 & x^2+y^2\\\end{bmatrix}$, and thus $S$ will be a subspace. (Scalar multiples definitely remain in $S$ as usual.)
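As a quick sanity check of this closure claim (a sketch using numpy; the particular entries are arbitrary sample values):

```python
import numpy as np

A = np.array([[1.0, 2.0], [-2.0, 1.0]])   # [[a, b], [-b, a]]
B = np.array([[3.0, -4.0], [4.0, 3.0]])   # [[c, d], [-d, c]]
T = A + B
x, y = T[0, 0], T[0, 1]

print(np.allclose(T, [[x, y], [-y, x]]))                # True: the form is preserved
print(np.allclose(T @ T.T, (x**2 + y**2) * np.eye(2)))  # True: T T^T = (x^2+y^2) I
print(np.allclose(T @ T.T, T.T @ T))                    # True: so T is in S
```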
Hope this helps. Please share any further suggestions or corrections. Thank you.
Best Answer
You are given three matrices,
$ A = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}$
$ B = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}$
$ C = \begin{bmatrix} -2 & -2 \\ -2 & 1 \end{bmatrix}$
A very quick inspection gives $C = A - 3B$, so that $1A - 3B - 1C = 0$.
Thus the three matrices are not linearly independent.
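The dependence relation can be confirmed in a line or two (a sketch using numpy):

```python
import numpy as np

A = np.array([[1, 1], [1, 1]])
B = np.array([[1, 1], [1, 0]])
C = np.array([[-2, -2], [-2, 1]])

print(np.array_equal(C, A - 3 * B))  # True: C = A - 3B
print(np.array_equal(1*A - 3*B - 1*C, np.zeros((2, 2), dtype=int)))  # True
```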
The vector space of symmetric 2 x 2 matrices has dimension 3, i.e. three linearly independent matrices are needed to form a basis.
The standard basis is defined by $ M = \begin{bmatrix} x & y \\ y & z \end{bmatrix} = x\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + y\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} + z\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}$
Clearly the given $A,B,C$ cannot form a basis, containing as they do only two independent matrices.
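For completeness, the standard-basis expansion above can be verified numerically (a sketch using numpy, with arbitrary sample entries):

```python
import numpy as np

E1 = np.array([[1, 0], [0, 0]])
E2 = np.array([[0, 1], [1, 0]])
E3 = np.array([[0, 0], [0, 1]])

x, y, z = 2, -5, 7  # arbitrary sample entries of a symmetric matrix
M = np.array([[x, y], [y, z]])
print(np.array_equal(M, x*E1 + y*E2 + z*E3))  # True
```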
So, why not?
Look for: Copying error? Did you mis-write or mis-type something?
Sign error?
Misunderstood question?
Is one of the other answers correct so this one then could be wrong?
And then after all other routes are exhausted, yes texts sometimes contain typos.