Let's say the matrix has $m$ rows and $n$ columns. Either $m < n$, $m > n$, or $m = n$.
If $m < n$, then we have $n$ columns which lie in $\mathbb{R}^m$. Since $\mathbb{R}^m$ has dimension $m$, we can't have more than $m$ linearly independent vectors in $\mathbb{R}^m$. So the $n$ columns must be linearly dependent, a contradiction. Thus, we cannot have $m < n$.
If $m > n$, then we have $m$ rows which lie in $\mathbb{R}^n$. Since $\mathbb{R}^n$ has dimension $n$, we can't have more than $n$ linearly independent vectors in $\mathbb{R}^n$. So the $m$ rows must be linearly dependent, a contradiction. Thus, we cannot have $m > n$.
The only remaining possibility is $m = n$, which means that the matrix must be square.
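The $m < n$ case above can be sanity-checked numerically. A sketch with NumPy (the matrix below is an arbitrary example, not from the question): any $2 \times 3$ matrix has rank at most $2$, so its three columns cannot be independent.

```python
import numpy as np

# A 2x3 matrix: its 3 columns live in R^2, so they cannot all be independent.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

m, n = A.shape
rank = np.linalg.matrix_rank(A)

# rank <= min(m, n); with m < n the columns are forced to be dependent.
assert rank <= min(m, n)
print(rank)  # 2, which is fewer than n = 3
```

The same check applied to the transpose illustrates the $m > n$ case.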
The definition of linear independence says you can't make $0$ out of a nontrivial linear combination. It says nothing about not being able to make other vectors out of linear combinations.
$(1,0)$ and $(0,1)$ are independent since you cannot write $(0,0) = c(1,0) + d(0,1)$ without $c = d = 0$. But you can write every other vector as a nontrivial linear combination of these; for example, $(2,3) = 2(1,0) + 3(0,1)$. Spend some time making sense of the definitions with concrete examples like this one, and they will eventually click.
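Both claims can be checked by solving a small linear system with NumPy (a minimal sketch, using the two vectors from the example): the only combination giving $(0,0)$ is the trivial one, while $(2,3)$ has the nontrivial coefficients $c = 2$, $d = 3$.

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
M = np.column_stack([e1, e2])  # columns are the candidate vectors

# Independence: the only solution of c*e1 + d*e2 = (0,0) is c = d = 0.
c0, d0 = np.linalg.solve(M, np.array([0.0, 0.0]))
print(c0, d0)  # 0.0 0.0

# But other vectors are still reachable: (2,3) = 2*(1,0) + 3*(0,1).
c, d = np.linalg.solve(M, np.array([2.0, 3.0]))
print(c, d)  # 2.0 3.0
```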
If you call your orthogonal set $\{v_1, v_2, \dots, v_n\}$, you can trivially write any vector in your set as a linear combination (take all coefficients $0$ except the coefficient of $v_k$ which is $1$).
$v_k = 0\cdot v_1+0\cdot v_2+\dots+0\cdot v_{k-1}+1\cdot v_k+0\cdot v_{k+1}+\dots+0\cdot v_n$
This is true of any set, whether it is orthogonal or not.
Moreover, any vector in the span of $\{v_1, v_2, \dots, v_n\}$ can be written as a linear combination of these vectors. This is again true of any set, whether orthogonal or not.
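The trivial combination above is easy to verify directly. A minimal sketch, with an arbitrarily chosen (non-orthogonal) set of vectors to emphasize that orthogonality plays no role:

```python
import numpy as np

# An arbitrary set of vectors; orthogonality is not needed for this point.
vs = [np.array([1.0, 2.0, 0.0]),
      np.array([0.0, 1.0, 1.0]),
      np.array([3.0, 0.0, 1.0])]

k = 1  # pick v_k = vs[1]
coeffs = [1.0 if i == k else 0.0 for i in range(len(vs))]

# v_k = 0*v_1 + ... + 1*v_k + ... + 0*v_n
combo = sum(c * v for c, v in zip(coeffs, vs))
assert np.allclose(combo, vs[k])
```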
It depends on the definition of an “orthogonal” system. More explicitly, on whether such a system is allowed to contain the zero vector.
If an orthogonal system is not allowed to contain the zero vector, then your counter-example does not work, and the given statement is true. (Assuming that the question is talking about orthogonality with respect to the standard inner product on $ℝ^{3^n}$.)
If an orthogonal system is allowed to contain the zero vector, then your solution is correct and the statement is false.
The definition of an “orthogonal set” from Wolfram MathWorld does not exclude the zero vector, so according to this convention your solution is correct.
In the context of an exam question I’d assume this to be a trick question meant to punish students who apply $$ \text{orthogonal} \implies \text{linearly independent} $$ without checking the necessary assumption of excluding the zero vector. I don’t see how the exponent $3^n$ is relevant to the problem; I imagine it’s only there to provide the students with some extra confusion, or even to distract them from the actual trap behind the statement (the zero vector).
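The trap can be made concrete with a small NumPy check (a sketch in $\mathbb{R}^3$ with arbitrarily chosen vectors): a set containing the zero vector is pairwise orthogonal, yet linearly dependent.

```python
import numpy as np

# A pairwise orthogonal set that includes the zero vector.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
z = np.zeros(3)

# Every pairwise dot product is 0, so the set is "orthogonal" ...
assert v1 @ v2 == 0 and v1 @ z == 0 and v2 @ z == 0

# ... yet it is linearly dependent: 0*v1 + 0*v2 + 1*z = 0 is a
# nontrivial combination (the coefficient of z is nonzero).
assert np.allclose(0 * v1 + 0 * v2 + 1 * z, np.zeros(3))
```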