If you know the determinant is multilinear (linear in each row/linear in each column), then express the determinant as a sum of determinants of matrices that have repeated rows/columns. Then show that the determinant of a matrix with two identical rows/columns is always zero.
If you do not know the determinant is multilinear in the rows/columns, and you know the definition in terms of sums of products over all permutations, then the same argument works to show the determinant is a sum of determinants of matrices with repeated rows/columns.
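As a numerical sanity check of this argument (a sketch in pure Python using the permutation-sum definition of the determinant; the specific rows and coefficients are arbitrary choices of mine): if row 3 equals $a\cdot(\text{row }1)+b\cdot(\text{row }2)$, then linearity in the third row splits the determinant into $a$ times a determinant with row 1 repeated plus $b$ times a determinant with row 2 repeated, both of which vanish.

```python
from itertools import permutations

def det(m):
    """Leibniz formula: signed sum of products over all permutations."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        # sign = (-1)^(number of inversions of p)
        sgn = (-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
        term = 1
        for i, j in enumerate(p):
            term *= m[i][j]
        total += sgn * term
    return total

# arbitrary example: row 3 is a linear combination of rows 1 and 2
r1, r2 = [1, 2, 3], [4, 5, 7]
a, b = 2, -3
r3 = [a * x + b * y for x, y in zip(r1, r2)]

# multilinearity in row 3 splits the determinant into two
# repeated-row determinants, each of which is zero
lhs = det([r1, r2, r3])
rhs = a * det([r1, r2, r1]) + b * det([r1, r2, r2])
assert lhs == rhs == 0
```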
If your only definition of the determinant is the inductive one (by expansion along minors), then prove it by induction, expanding along a row or column that is not a linear combination of the other rows/columns.
Edit: You've edited your question, and now the question is the precise converse of what you originally posted. This is a bit of bad form, since any answers already posted would automatically look completely wrong-headed. In the future, consider editing to add the new question to the old, indicating this is an edit or addition.
So, now you want to show that if the determinant is zero, then a row/column is a linear combination of other rows/columns. Looking at the transpose if necessary, it suffices to consider the case of rows. Note that applying elementary row operations does not change whether the determinant is zero or nonzero (an elementary row operation will either multiply the determinant by a nonzero constant, or will not change the determinant). So the determinant of the matrix is zero if and only if the determinant of its row-echelon form is zero; the determinant of the row-echelon form is zero if and only if there is a row of zeros. Tracing back the elementary row operations will express the original row as a linear combination of the other rows.
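The elimination bookkeeping can be sketched in code (pure Python with exact fractions; the helper name `row_echelon` and the example matrix are my own choices): adding a multiple of one row to another leaves the determinant unchanged, swapping rows flips its sign, and a missing pivot (a zero row in the echelon form) forces the determinant to be zero.

```python
from fractions import Fraction

def row_echelon(mat):
    """Forward elimination; returns (echelon form, determinant)."""
    m = [[Fraction(x) for x in row] for row in mat]
    n = len(m)
    d = Fraction(1)
    for col in range(n):
        # find a pivot below (or on) the diagonal
        piv = next((r for r in range(col, n) if m[r][col] != 0), None)
        if piv is None:
            return m, Fraction(0)  # no pivot: a zero row, so det = 0
        if piv != col:
            m[col], m[piv] = m[piv], m[col]
            d = -d  # a row swap flips the sign of the determinant
        d *= m[col][col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            # adding a multiple of a row leaves the determinant unchanged
            m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return m, d

# row 3 = 2*(row 2) - (row 1), so the determinant must be zero
M = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
E, d = row_echelon(M)
# d == 0 and the last row of E is all zeros
```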
(Just so this question has an answer.)
As fkraiem points out, a permutation matrix is a matrix with precisely one $1$ in each row and precisely one $1$ in each column, and zeroes elsewhere.
Your argument is correct, as is your conclusion. Furthermore, $n\times n$ permutation matrices are in one-to-one correspondence with $S_n$, the group of permutations on $n$ elements (hence the name). The determinant of a permutation matrix is the sign of the corresponding permutation.
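This correspondence is easy to verify by brute force for small $n$ (a sketch in pure Python; `perm_matrix` and `sign` are ad hoc helper names): in the permutation-sum expansion of the determinant, the only nonzero term for a permutation matrix is the one indexed by its own permutation.

```python
from itertools import permutations

def perm_matrix(p):
    """Permutation matrix with a 1 in row i, column p[i] (0-indexed)."""
    n = len(p)
    return [[1 if j == p[i] else 0 for j in range(n)] for i in range(n)]

def sign(p):
    """Sign of a permutation via its inversion count."""
    n = len(p)
    return (-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))

def det(m):
    """Leibniz formula: signed sum of products over all permutations."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        term = 1
        for i, j in enumerate(p):
            term *= m[i][j]
        total += sign(p) * term
    return total

# every 4x4 permutation matrix has determinant equal to the
# sign of its permutation
assert all(det(perm_matrix(p)) == sign(p) for p in permutations(range(4)))
```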
Best Answer
As described by examples in J.M.'s answer and Paul's comment to Martin's answer, it is possible to get zero and $\pm2$. But that is not all. Consider matrices of the form (thanks to Martin for pointing out that I need to use an odd size) $$ A=\pmatrix{P&0&0\cr0&P&0\cr0&0&P\cr}, $$ where $P$ is the matrix from Paul's comment $$ P=\pmatrix{1&1&0\cr1&0&1\cr0&1&1\cr}. $$ Obviously $\det A=(-2)^3=-8$, and it is clear that we can get any odd power of two in this way. I believe this covers all the possibilities: $0$ and $\pm2^k$, where $k>0$ is an odd natural number.
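These determinants can be verified directly (a sketch in Python using exact Gaussian elimination; `block_diag` and `det` are ad hoc helpers of mine): $\det P=-2$, and the block-diagonal structure multiplies the block determinants, giving $\det A=(-2)^3=-8$.

```python
from fractions import Fraction

def det(mat):
    """Determinant via exact fraction Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in mat]
    n, d = len(m), Fraction(1)
    for c in range(n):
        p = next((r for r in range(c, n) if m[r][c] != 0), None)
        if p is None:
            return 0  # zero column below the diagonal: det = 0
        if p != c:
            m[c], m[p] = m[p], m[c]
            d = -d
        d *= m[c][c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            m[r] = [a - f * b for a, b in zip(m[r], m[c])]
    return int(d)

def block_diag(*blocks):
    """Assemble a block-diagonal matrix from square blocks."""
    n = sum(len(b) for b in blocks)
    out = [[0] * n for _ in range(n)]
    off = 0
    for b in blocks:
        for i, row in enumerate(b):
            for j, v in enumerate(row):
                out[off + i][off + j] = v
        off += len(b)
    return out

P = [[1, 1, 0], [1, 0, 1], [0, 1, 1]]
A = block_diag(P, P, P)
assert det(P) == -2
assert det(A) == -8  # (-2)^3, one factor per diagonal block
```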
Edit: Here's a proof that $\pm 2^{2t+1}, t\in\mathbf{N},$ and $0$ are all the possibilities. Do the following: Pick one of the 1s on row number $i_1$. Check the location of the other 1 on that row. Let the other 1 on that column be on row $i_2$. Keep doing this horizontal-vertical motion to define a set of rows $i_3,\ldots$. Sooner or later you get back to row number $i_1$. We have thus identified a subset of rows $i_1,\ldots,i_k$ for some integer $k>1$. By reordering the rows, we can bring these to the top. By reordering the columns we get a block looking exactly like the matrices in J.M.'s answer in the top-left corner. That block has determinant $2$ or $0$ as described by J.M. - according to the parity of $k$.
We may or may not have exhausted all the rows of the matrix. If we have not covered all of them, we build another block in a similar fashion. A block of even size contributes determinant $0$, so a nonzero determinant forces every block to have odd size; since the matrix itself has an odd number of rows, the number of blocks must then be odd as well, which is why only odd powers of two occur. The parity of the necessary row/column permutations gives us the sign.
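For $n=3$ the argument predicts a single block of size $3$, hence determinant $\pm2$ and never $0$; a brute-force check over all $3\times3$ zero-one matrices with exactly two ones in every row and column confirms this (a Python sketch; the enumeration over all $2^9$ bit patterns is my own ad hoc choice).

```python
from itertools import product

def det3(m):
    """3x3 determinant by the rule of Sarrus."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h

dets = set()
for bits in product((0, 1), repeat=9):
    m = [bits[0:3], bits[3:6], bits[6:9]]
    # keep only matrices with exactly two 1s per row and per column
    if all(sum(r) == 2 for r in m) and all(sum(r[j] for r in m) == 2 for j in range(3)):
        dets.add(det3(m))

# every admissible 3x3 matrix has determinant 2 or -2
assert dets == {-2, 2}
```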