Two sets of vectors in the same vector space, $S_1$ and $S_2$, span the same subspace if and only if:
- Each vector in $S_1$ can be written as a linear combination of the vectors in $S_2$; and
- Each vector in $S_2$ can be written as a linear combination of the vectors in $S_1$.
There may be ways of inferring these properties without showing them directly (such as dimension arguments, as user6312 suggests), but in the end it comes down to showing that these conditions hold or that they cannot hold.
Note that these conditions are not intrinsic: it is almost never enough to examine $S_1$ on its own and then $S_2$ on its own; only in very special circumstances does that suffice, namely when you can prove that the "sizes" of the spans don't match (a dimension argument). So just determining whether each set is linearly dependent or independent is not enough in this case.
So: are $(-2,-6,0)$ and $(1,1,-2)$ each linear combinations of $(1,2,-1)$, $(0,1,1)$, and $(2,5,-1)$? To find out, you can try solving the two systems of linear equations:
$$\begin{align*}
\alpha_1 \left(\begin{array}{r}1\\2\\-1\end{array}\right) + \beta_1\left(\begin{array}{c}0\\1\\1\end{array}\right) + \gamma_1\left(\begin{array}{r}2\\5\\-1\end{array}\right) &= \left(\begin{array}{r}-2\\-6\\0\end{array}\right)\\
\alpha_2 \left(\begin{array}{r}1\\2\\-1\end{array}\right) + \beta_2\left(\begin{array}{c}0\\1\\1\end{array}\right) + \gamma_2\left(\begin{array}{r}2\\5\\-1\end{array}\right) &= \left(\begin{array}{r}1\\1\\-2\end{array}\right)
\end{align*}$$
and see if they each have solutions. It can even be done both at once, by doing row reduction on the augmented matrix
$$\left(\begin{array}{rrr|rr}
1 & 0 & 2 & -2 & 1\\
2 & 1 & 5 & -6 & 1\\
-1 & 1 & -1 & 0 & -2
\end{array}\right).$$
If either system has no solutions, then you know that not every vector in $S_2$ is in the span of $S_1$ and you are done; if both systems have solutions, then every vector in $S_2$ is in the span of $S_1$, so $\mathrm{span}(S_2)\subseteq \mathrm{span}(S_1)$.
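This consistency check can be sketched numerically (not part of the original argument; it assumes NumPy is available). A system $Ax = b$ has a solution exactly when appending $b$ to $A$ as an extra column does not increase the rank:

```python
import numpy as np

# Vectors of S1 = {(1,2,-1), (0,1,1), (2,5,-1)} as columns.
A = np.array([[ 1, 0,  2],
              [ 2, 1,  5],
              [-1, 1, -1]], dtype=float)

# Vectors of S2 = {(-2,-6,0), (1,1,-2)} as columns.
B = np.array([[-2,  1],
              [-6,  1],
              [ 0, -2]], dtype=float)

# A x = b is consistent exactly when appending b to A does not
# increase the rank.
rank_A = np.linalg.matrix_rank(A)
in_span = [bool(np.linalg.matrix_rank(np.hstack([A, B[:, [j]]])) == rank_A)
           for j in range(B.shape[1])]
print(in_span)
```

If both entries come out `True`, every vector of $S_2$ lies in $\mathrm{span}(S_1)$.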
Then you need to see if the converse inclusion holds: is every vector in $S_1$ a linear combination of the vectors in $S_2$? That is, can we solve the three systems of linear equations?
$$\begin{align*}
\rho_1\left(\begin{array}{r}-2\\-6\\0\end{array}\right) + \sigma_1 \left(\begin{array}{r}1\\1\\-2\end{array}\right) &= \left(\begin{array}{r}1\\2\\-1\end{array}\right)\\
\rho_2\left(\begin{array}{r}-2\\-6\\0\end{array}\right) + \sigma_2 \left(\begin{array}{r}1\\1\\-2\end{array}\right) &= \left(\begin{array}{r}0\\1\\1\end{array}\right)\\
\rho_3\left(\begin{array}{r}-2\\-6\\0\end{array}\right) + \sigma_3 \left(\begin{array}{r}1\\1\\-2\end{array}\right) &= \left(\begin{array}{r}2\\5\\-1\end{array}\right)
\end{align*}$$
If we can solve all, then each vector in $S_1$ is in the span of $S_2$, so $\mathrm{span}(S_1)\subseteq \mathrm{span}(S_2)$; together with the previous
inclusion, this would show the spans are equal. If some equation cannot be solved, then not every vector in $S_1$ is in the span of $S_2$, so the spans are different.
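The same rank test settles the converse inclusion (again a numerical sketch assuming NumPy, not part of the original derivation): each vector of $S_1$ is appended in turn to the matrix of $S_2$-columns, and the rank is compared.

```python
import numpy as np

# Vectors of S2 as columns.
C = np.array([[-2,  1],
              [-6,  1],
              [ 0, -2]], dtype=float)

# Vectors of S1 as columns.
S1 = np.array([[ 1, 0,  2],
               [ 2, 1,  5],
               [-1, 1, -1]], dtype=float)

# Each system is solvable exactly when appending the target column
# does not increase the rank.
rank_C = np.linalg.matrix_rank(C)
solvable = [bool(np.linalg.matrix_rank(np.hstack([C, S1[:, [j]]])) == rank_C)
            for j in range(S1.shape[1])]
print(solvable)
```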
(There is one thing that you can rescue from your efforts: since you proved that the set $S_1$ is linearly dependent, you can extract from it a linearly independent set (in this case, for instance, the first two vectors), and replace $S_1$ with that set of two vectors (because the third vector is a linear combination of the first two). That will mean that checking that "every vector in $S_1$ is a linear combination of the vectors in $S_2$" and checking that "every vector in $S_2$ is a linear combination of the vectors in $S_1$" will be simpler: instead of checking five things, you only need to check four.)
"Nonparallel" is probably an inappropriate notion to think about: three vectors can be pairwise nonparallel and still linearly dependent. That is, the fact that they are pairwise nonparallel does not imply that no two of them generate the third. This is exactly what happens with the set $S$ in the edit: the third vector is the sum of the second and twice the first.
This shows that $S$ is not a basis; in particular, it spans a subspace of dimension $2$. Hence it is possible to find vectors that are not generated by the elements of $S$.
The natural question is: nonparallelism is not good enough, but it is easy to check. Is there a comparably easy way to check linear independence? The answer: build a matrix whose columns are the vectors you are testing (this works when you have $n$ vectors in $\mathbb{R}^n$, so the matrix is square). If the determinant of that matrix is nonzero, the vectors are linearly independent. If you do this with $S$, you get a determinant of $0$.
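As a quick sketch (assuming NumPy is available, and assuming $S$ is the set $\{(1,2,-1),(0,1,1),(2,5,-1)\}$ from the question), the determinant test looks like this:

```python
import numpy as np

# The set S, each vector as a column; the third column equals
# twice the first plus the second, so the columns are dependent.
S = np.array([[ 1, 0,  2],
              [ 2, 1,  5],
              [-1, 1, -1]], dtype=float)

det = np.linalg.det(S)
print(det)
```

A determinant of $0$ (up to floating-point noise) confirms the linear dependence.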
Note that your vectors forming a basis is only a sufficient condition for every vector to lie in their span. If they are not a basis, a given vector may lie inside or outside their span, and you have to check which. That is done exactly as you said: by trying to find coefficients $c_i$.
If $M(A)$ is the matrix with the vectors of $A$ as columns and $M(AB)$ is the matrix with the vectors of both $A$ and $B$ as columns, then $\mathrm{span}(B) \subseteq \mathrm{span}(A)$ if and only if $\mathrm{rank}(M(A))=\mathrm{rank}(M(AB))$. After all, the rank is the dimension of the column space of a matrix.
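This rank criterion can be sketched numerically (assuming NumPy; as an illustration, $A$ and $B$ here are taken to be the sets $S_1$ and $S_2$ from the question above, which is an assumption, not part of this answer):

```python
import numpy as np

# M(A): the vectors of A as columns.
MA = np.array([[ 1, 0,  2],
               [ 2, 1,  5],
               [-1, 1, -1]], dtype=float)
# The vectors of B as columns.
MB = np.array([[-2,  1],
               [-6,  1],
               [ 0, -2]], dtype=float)
# M(AB): the vectors of A and B together.
MAB = np.hstack([MA, MB])

# span(B) is contained in span(A) iff appending B's columns
# does not raise the rank, i.e. does not enlarge the column space.
contained = bool(np.linalg.matrix_rank(MA) == np.linalg.matrix_rank(MAB))
print(contained)
```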