Those are not clear statements of dependence and independence of the set $\{v_1,\dots,v_n\}$.
The set would be dependent if:
There exist $\alpha_i$, not all of which are zero, such that $\sum \alpha_iv_i=0$
The set would be independent if it satisfies the negation, i.e. there is no such choice of coefficients, not all zero:
If $\sum \alpha_iv_i=0$, then all the $\alpha_i=0$
If all the $\alpha_i=0$, then $\sum \alpha_iv_i=0$ holds automatically, so that case is not interesting! It is a special case that works for every set. A linearly independent set is special precisely because you can't get a combination to add up to zero unless you use the trivial all-zero coefficients.
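If it helps to see the definition in action, here is a minimal numpy sketch (the specific vectors are my own made-up example): a finite set is independent exactly when the matrix with those vectors as columns has rank equal to the number of columns, since that is precisely the condition that only $\alpha = 0$ solves $A\alpha = 0$.

```python
import numpy as np

# Three example vectors, stacked as the columns of a matrix.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])   # v3 = v1 + v2, so the set is dependent
A = np.column_stack([v1, v2, v3])

# Independent  <=>  rank(A) == number of columns
# (i.e. only alpha = 0 solves A @ alpha = 0).
print(np.linalg.matrix_rank(A) == A.shape[1])   # False: v1 + v2 - v3 = 0
```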
In any number of dimensions, linear independence expresses the idea that one vector is not in the span of the other vectors.
For example, if $\sum \alpha_iv_i=0$ where at least one of the alphas is nonzero (say, for convenience, $\alpha_1$), then $v_1=-\sum_{i=2}^n \alpha_1^{-1}\alpha_iv_i$, and so $v_1$ can be generated by $v_2,\dots,v_n$. Then we could just throw $v_1$ out, since we know the other $v_i$ can already generate it.
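A tiny concrete case of this (my own example, not from the question): take $v_1=(1,2)$ and $v_2=(2,4)$. Then $2v_1-v_2=0$ with $\alpha_1=2\neq 0$ and $\alpha_2=-1$, and the formula above gives
$$v_1=-\alpha_1^{-1}\alpha_2v_2=-\tfrac{1}{2}(-1)v_2=\tfrac{1}{2}v_2,$$
so $v_1$ lies in the span of $v_2$ and can be thrown out without shrinking the span.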
So when a set is linearly independent, it means that each member really does contribute to the vector space they generate. Each element adds something new that can't be produced by the other vectors.
The definition of linear independence says you can't make $0$ out of a nontrivial linear combination (one where at least one coefficient is nonzero). It says nothing about not being able to make any other vector out of linear combinations.
$(1,0)$ and $(0,1)$ are independent since you cannot write $(0,0) = c(1,0) + d(0,1)$ without $c=d=0$. But you can write every other vector as a nontrivial linear combination of these. $(2,3) = 2(1,0)+3(0,1)$, for example. Spend some time making sense of the definitions with some concrete examples like this one and it will make sense eventually.
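For what it's worth, here is a quick numpy check of exactly this example (the code is my illustration, not part of the original answer):

```python
import numpy as np

# Columns are the vectors (1,0) and (0,1).
A = np.column_stack([[1.0, 0.0], [0.0, 1.0]])

# Solving c*(1,0) + d*(0,1) = (2,3) gives exactly one answer:
print(np.linalg.solve(A, np.array([2.0, 3.0])))   # [2. 3.]

# And the only way to hit (0,0) is c = d = 0:
print(np.linalg.solve(A, np.array([0.0, 0.0])))   # [0. 0.]
```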
If you call your orthogonal set $\{v_1, v_2, \dots, v_n\}$, you can trivially write any vector in your set as a linear combination (take all coefficients $0$ except the coefficient of $v_k$ which is $1$).
$v_k = 0\cdot v_1+0\cdot v_2+\dots+0\cdot v_{k-1}+1\cdot v_k+0\cdot v_{k+1}+\dots+0\cdot v_n$
This is true of any set, whether it is orthogonal or not.
Moreover, any vector in the span of $\{v_1, v_2, \dots, v_n\}$ can be written as a linear combination of these vectors. This is again true of any set, whether orthogonal or not.
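To make the span statement concrete, here is a small numpy sketch using a made-up set that spans $\mathbb{R}^2$ but is *not* independent: every vector in the span can be written as a combination, and for a dependent set there is more than one way to do it.

```python
import numpy as np

# A dependent spanning set for R^2: columns (1,0), (0,1), (1,1).
A = np.column_stack([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w = np.array([2.0, 3.0])

# lstsq finds one coefficient vector c with A @ c = w ...
c, *_ = np.linalg.lstsq(A, w, rcond=None)
print(np.allclose(A @ c, w))    # True

# ... but since the set is dependent, it is not the only one:
c2 = np.array([2.0, 3.0, 0.0])
print(np.allclose(A @ c2, w))   # True as well, different coefficients
```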
To answer your first question, no. A linear combination of linearly independent vectors uniquely describes a given vector: with respect to an independent set, the coefficients are determined by the vector itself, so two combinations are equal iff their coefficients are equal.
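Spelled out (this is the standard one-line argument, filling in the step above): if $\sum a_iv_i=\sum b_iv_i$, then
$$\sum_i (a_i-b_i)v_i=0,$$
and independence forces $a_i-b_i=0$, i.e. $a_i=b_i$ for every $i$.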
For your second question, the answer is also no. There is nothing wrong with some of the coefficients being zero. For instance, representing one of the unit vectors (however you define them) in your system requires all of the other coefficients to be zero.