In the future, please limit yourself to one problem in your post. You're more likely to get helpful responses than if you list several problems.
For #1, you have given an argument for the "if" direction, but you haven't addressed the "only if" direction. You must show that if $S$ is linearly dependent, then one of the two cases stated holds.
For #2, you should set a linear combination of the columns equal to zero and argue that each of the coefficients must be zero. Your argument about directions is not valid: you can select three vectors in $\mathbb{R}^2$ that all have different directions, and those three vectors will be linearly dependent (why?).
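For a concrete instance of that point: the vectors $(1,0)$, $(0,1)$, and $(1,1)$ point in three different directions in $\mathbb{R}^2$, yet
$$1\cdot(1,0) + 1\cdot(0,1) - 1\cdot(1,1) = (0,0),$$
a nontrivial linear combination equal to zero, so they are linearly dependent. (In general, any three vectors in $\mathbb{R}^2$ are linearly dependent, since $\dim \mathbb{R}^2 = 2$.)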
For #3, you definitely need to say something for the other direction; otherwise, you will have only proven the "only if" direction. For the "if" direction, you assume that $\{u+v, u-v\}$ is linearly independent (i.e. $b_1 (u+v) + b_2(u-v) = 0 \Rightarrow b_1 = b_2 = 0$). Then, to prove that $\{u, v\}$ is linearly independent, you assume that $a_1 u + a_2 v = 0$ and must show that $a_1 = a_2 = 0$. Your proof will be similar to the other direction, using an algebraic calculation to get equations in $a_1$ and $a_2$ whose only solution is $a_1 = a_2 = 0$.
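As a sketch of that calculation (assuming the scalars come from a field where $2 \neq 0$, e.g. $\mathbb{R}$): since $u = \frac{1}{2}\big((u+v)+(u-v)\big)$ and $v = \frac{1}{2}\big((u+v)-(u-v)\big)$, the equation $a_1 u + a_2 v = 0$ becomes
$$\frac{a_1 + a_2}{2}\,(u+v) + \frac{a_1 - a_2}{2}\,(u-v) = 0,$$
so linear independence of $\{u+v, u-v\}$ forces $a_1 + a_2 = 0$ and $a_1 - a_2 = 0$; adding and subtracting these equations gives $a_1 = a_2 = 0$.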
Hope that helps. Keep practicing and asking questions - that's the best way to learn how to do proofs.
In questions like this, if you're still new to doing mathematics in general, I think it's really important to be systematic:
In your case, $\{v_i\}_{1\leq i\leq n}$ is a basis for $V$. This means that (a) $\{v_i\}_{1\leq i\leq n}$ is a spanning set, and (b) $\{v_i\}_{1\leq i\leq n}$ is linearly independent.
So... given some $v\in V$, we need to show that it has a unique representation as a linear combination of the $v_i$'s. First, we need to show that such a representation exists. This follows from the fact that $\{v_i\}_{1\leq i\leq n}$ is spanning - this is literally just the definition of spanning, so here it is alright to say that it's obvious.
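To spell that out: spanning means precisely that there exist scalars $\alpha_1,\dots,\alpha_n$ such that
$$v=\sum_{i=1}^n \alpha_i v_i.$$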
Next, we need to show uniqueness, and as you indicated, this should follow from linear independence. However, in order to do this, we need to reduce the question to a question of linear independence.
So, assume that $v=\sum_{i=1}^n \alpha_iv_i=\sum_{i=1}^n \beta_i v_i$. Subtracting, we see that $0=\sum_{i=1}^n (\alpha_i-\beta_i)v_i$, and by the definition of linear independence, this means that $\alpha_i-\beta_i=0$ for all $i$. Hence $\alpha_i=\beta_i$ for all $i$, which is the desired uniqueness.
The point of this type of exercise is typically to practice proof techniques, and the important thing, in my experience, is just to attempt to be completely systematic and never claim that something is just obvious (even though it may very well be).
First of all, I think question number 1 has a mistake; you probably want to prove that $\{1,x-1,\dots,x^{n}-x^{n-1}\}$ is a basis for $P_{n}$. For that question, I hope you know that $B:=\{1,x,x^{2},\dots,x^{n}\}$ is a basis for $P_{n}$ (if not, it is easy to prove); then your question is easy to settle using elementary facts of linear algebra, because you only need to take linear combinations of vectors of the basis $B$, as in the telescoping sum below. Question 2 is a little harder, but I suppose that reductio ad absurdum would be a good way to start. Question 3 follows from the isomorphism between $M_{2\times 3}(\mathbb{R})$ and $\mathbb{R}^{6}$. Generally, proofs in linear algebra don't have a standard method of solution; that said, most proofs of linear independence work by reductio ad absurdum.
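To make the linear-combination idea for question 1 concrete (a sketch, not the full proof): each element of $B$ is a telescoping sum of elements of the proposed basis,
$$x^k = 1 + (x-1) + (x^2-x) + \cdots + (x^k - x^{k-1}), \qquad 0 \le k \le n,$$
so $\{1, x-1, \dots, x^{n}-x^{n-1}\}$ spans $P_{n}$; since it also has $n+1 = \dim P_{n}$ elements, it is a basis.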