Two sets of vectors in the same vector space, $S_1$ and $S_2$, span the same subspace if and only if:
- Each vector in $S_1$ can be written as a linear combination of the vectors in $S_2$; and
- Each vector in $S_2$ can be written as a linear combination of the vectors in $S_1$.
There may be ways of inferring these properties without showing them directly (such as the dimension argument user6312 suggests), but in the end it comes down to showing that this holds or that it cannot hold.
Note that these conditions are not intrinsic: it is almost never enough to look at $S_1$ on its own and then at $S_2$ on its own; only in rather special circumstances (when you can prove that the "sizes" of the spans don't match, via a dimension argument) does that suffice.
So just figuring out whether each set is linearly dependent or independent is not enough in this case.
So: are $(-2,-6,0)$ and $(1,1,-2)$ each linear combinations of $(1,2,-1)$, $(0,1,1)$, and $(2,5,-1)$? To find out, you can try solving the two systems of linear equations:
$$\begin{align*}
\alpha_1 \left(\begin{array}{r}1\\2\\-1\end{array}\right) + \beta_1\left(\begin{array}{c}0\\1\\1\end{array}\right) + \gamma_1\left(\begin{array}{r}2\\5\\-1\end{array}\right) &= \left(\begin{array}{r}-2\\-6\\0\end{array}\right)\\
\alpha_2 \left(\begin{array}{r}1\\2\\-1\end{array}\right) + \beta_2\left(\begin{array}{c}0\\1\\1\end{array}\right) + \gamma_2\left(\begin{array}{r}2\\5\\-1\end{array}\right) &= \left(\begin{array}{r}1\\1\\-2\end{array}\right)
\end{align*}$$
and see if they each have solutions. In fact, both can be done at once, by doing row reduction on
$$\left(\begin{array}{rrr|rr}
1 & 0 & 2 & -2 & 1\\
2 & 1 & 5 & -6 & 1\\
-1 & 1 & -1 & 0 & -2
\end{array}\right).$$
If either system has no solution, then not every vector in $S_2$ is in the span of $S_1$, so the spans differ and you are done; if both systems have solutions, then every vector in $S_2$ is in the span of $S_1$, so $\mathrm{span}(S_2)\subseteq \mathrm{span}(S_1)$.
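For concreteness, this consistency check can be carried out mechanically. Below is a small sketch in plain Python using exact rational arithmetic (the `in_span` helper is my own name, not a standard function): it row-reduces the augmented matrix $[A \mid b]$ and reports whether the system is solvable.

```python
from fractions import Fraction

def in_span(vectors, target):
    """Decide whether `target` is a linear combination of `vectors`
    by row-reducing the augmented matrix [A | target]."""
    m, n = len(target), len(vectors)
    # Columns of A are the spanning vectors; the last column is the target.
    M = [[Fraction(v[i]) for v in vectors] + [Fraction(target[i])]
         for i in range(m)]
    row = 0
    for col in range(n):
        # Find a pivot in this column at or below the current row.
        pivot = next((r for r in range(row, m) if M[r][col] != 0), None)
        if pivot is None:
            continue
        M[row], M[pivot] = M[pivot], M[row]
        M[row] = [x / M[row][col] for x in M[row]]
        for r in range(m):
            if r != row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[row])]
        row += 1
    # The system is inconsistent iff some row reads (0 ... 0 | nonzero).
    return not any(all(x == 0 for x in r[:n]) and r[n] != 0 for r in M)

S1 = [(1, 2, -1), (0, 1, 1), (2, 5, -1)]
S2 = [(-2, -6, 0), (1, 1, -2)]
print(all(in_span(S1, v) for v in S2))  # True: span(S2) ⊆ span(S1)
```

If this printed `False`, some row of the reduced matrix would have the form $(0\;0\;0\mid c)$ with $c\neq 0$, meaning one of the systems is inconsistent.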
Then you need to see if the reverse inclusion holds: is every vector in $S_1$ a linear combination of the vectors in $S_2$? That is, can we solve the following three systems of linear equations:
$$\begin{align*}
\rho_1\left(\begin{array}{r}-2\\-6\\0\end{array}\right) + \sigma_1 \left(\begin{array}{r}1\\1\\-2\end{array}\right) &= \left(\begin{array}{r}1\\2\\-1\end{array}\right)\\
\rho_2\left(\begin{array}{r}-2\\-6\\0\end{array}\right) + \sigma_2 \left(\begin{array}{r}1\\1\\-2\end{array}\right) &= \left(\begin{array}{r}0\\1\\1\end{array}\right)\\
\rho_3\left(\begin{array}{r}-2\\-6\\0\end{array}\right) + \sigma_3 \left(\begin{array}{r}1\\1\\-2\end{array}\right) &= \left(\begin{array}{r}2\\5\\-1\end{array}\right)
\end{align*}$$
If we can solve all, then each vector in $S_1$ is in the span of $S_2$, so $\mathrm{span}(S_1)\subseteq \mathrm{span}(S_2)$; together with the previous
inclusion, this would show the spans are equal. If some equation cannot be solved, then not every vector in $S_1$ is in the span of $S_2$, so the spans are different.
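Since these three systems each have only two unknowns, one way to sketch the check is back-substitution by hand, translated into code. The shortcut below exploits the fact that the first vector of $S_2$ has third coordinate $0$, so it is ad hoc for these particular numbers, not a general method; the helper name `solve_two_unknowns` is mine.

```python
from fractions import Fraction

u = (-2, -6, 0)   # first vector of S_2
w = (1, 1, -2)    # second vector of S_2

def solve_two_unknowns(u, w, target):
    """Try to solve rho*u + sigma*w = target (3 equations, 2 unknowns).
    Pins down sigma from the third equation (using u[2] == 0) and rho
    from the first, then checks the middle equation for consistency."""
    sigma = Fraction(target[2], w[2])                  # third equation, since u[2] = 0
    rho = (Fraction(target[0]) - sigma * w[0]) / u[0]  # first equation
    if rho * u[1] + sigma * w[1] == target[1]:         # verify the second equation
        return rho, sigma
    return None                                        # inconsistent: target not in span

for t in [(1, 2, -1), (0, 1, 1), (2, 5, -1)]:
    print(t, solve_two_unknowns(u, w, t))  # a (rho, sigma) pair, or None
```

If every target returns a `(rho, sigma)` pair, each vector of $S_1$ lies in $\mathrm{span}(S_2)$.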
(There is one thing that you can rescue from your efforts: since you proved that the set $S_1$ is linearly dependent, you can extract from it a linearly independent set (in this case, for instance, the first two vectors), and replace $S_1$ with that set of two vectors (because the third vector is a linear combination of the first two). That will mean that checking that "every vector in $S_1$ is a linear combination of the vectors in $S_2$" and checking that "every vector in $S_2$ is a linear combination of the vectors in $S_1$" will be simpler: instead of checking five things, you only need to check four.)
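As an aside, the dimension argument mentioned earlier can also be made mechanical: viewing the vectors as rows, $\mathrm{span}(S_1)=\mathrm{span}(S_2)$ exactly when the ranks of $S_1$, $S_2$, and $S_1\cup S_2$ all agree. A small sketch in plain Python (the `rank` helper is my own, using exact rational arithmetic):

```python
from fractions import Fraction

def rank(rows):
    """Row rank via Gaussian elimination over exact rationals."""
    M = [[Fraction(x) for x in r] for r in rows]
    rk = 0
    for col in range(len(M[0])):
        # Find a pivot at or below row rk in this column.
        piv = next((r for r in range(rk, len(M)) if M[r][col] != 0), None)
        if piv is None:
            continue
        M[rk], M[piv] = M[piv], M[rk]
        for r in range(rk + 1, len(M)):
            f = M[r][col] / M[rk][col]
            M[r] = [a - f * b for a, b in zip(M[r], M[rk])]
        rk += 1
    return rk

S1 = [(1, 2, -1), (0, 1, 1), (2, 5, -1)]
S2 = [(-2, -6, 0), (1, 1, -2)]
# The spans coincide exactly when all three ranks agree.
print(rank(S1), rank(S2), rank(S1 + S2))  # prints: 2 2 2
```

Equal ranks for $S_1$, $S_2$, and their union mean neither set contributes anything outside the other's span, so the two spans are equal.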
Most introductory books on Linear Algebra have a Theorem which says something like
Let $A$ be a square $n \times n$ matrix. Then the following are equivalent:
What does this mean? It simply means that if you want to check whether any one of these conditions is true or false, you can pick any other condition from the list and check that instead.
Your question is: can you check the second condition instead of the third or fourth? That is exactly what the Theorem says: YES.
Best Answer
Yes, it does. If it didn't, there would be a vector $b$ in the span of $\{b_1,\dotsc,b_n\}$ that is not in the span of $\{c_1,\dotsc,c_n\}$, so $\{c_1,\dotsc,c_n,b\}$ would span an $(n+1)$-dimensional space, contradicting the fact that $\{b_1,\dotsc,b_n\}$ spans that space.