The answer is no, these will not be independent. Consider the example where $N = 2$; your method 2 is then equivalent to:
- Choose $v_1 \sim \text{Unif}[0,1]$.
- Set $v_2 = 1-v_1$.
Note that $v_2$ is itself uniformly distributed on $[0,1]$.
Now if $v_1,\,v_2$ were independent, we would have for all $s,t \in [0,1]$
$$ \mathbf P[ v_1 \leq s, v_2 \leq t] = \mathbf P[v_1 \leq s] \mathbf P[v_2 \leq t] = st.$$
However, since $v_2 = 1 - v_1$ we actually have
$$\mathbf P[v_1 \leq s, (1-v_1) \leq t] = \mathbf P[v_1 \leq s, v_1 \geq 1-t] = \mathbf P[(1-t) \leq v_1 \leq s].$$
The exact value of the final expression depends on the values of $s, t \in [0,1]$, but as an example, if $s = t = 1/2$ then
$$
\mathbf P[(1-t) \leq v_1 \leq s] = \mathbf P[1/2 \leq v_1 \leq 1/2] = \mathbf P[ v_1 = 1/2] = 0 \neq \frac14,
$$
where $\frac14$ is the answer you would expect for independent $v_1,v_2$.
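A quick simulation makes the failure of the product rule concrete (a minimal sketch, assuming NumPy; the variable names are just illustrative): the empirical joint probability $\mathbf P[v_1 \leq 1/2,\, v_2 \leq 1/2]$ is essentially $0$, while the product of the marginals is about $1/4$.

```python
import numpy as np

rng = np.random.default_rng(0)
v1 = rng.uniform(0.0, 1.0, size=1_000_000)
v2 = 1.0 - v1   # "method 2" with N = 2

# Joint probability vs. product of marginals for s = t = 1/2
joint = np.mean((v1 <= 0.5) & (v2 <= 0.5))
product = np.mean(v1 <= 0.5) * np.mean(v2 <= 0.5)
print(joint)    # ~ 0.0
print(product)  # ~ 0.25
```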
Best Answer
If you fill a matrix with samples from a uniform distribution over $[-1,1]$ and apply the Gram-Schmidt procedure to its columns, you can generate every possible orthogonal matrix (note that orthogonal matrices necessarily have entries in $[-1,1]$). However, I don't believe that it generates all orthogonal matrices with equal probability, i.e. the result is not uniformly (Haar) distributed over the orthogonal group.
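As a rough sketch of that construction (assuming NumPy; `gram_schmidt` is an illustrative helper, not a library routine), fill a matrix with Uniform$[-1,1]$ entries and orthonormalize its columns:

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A via classical Gram-Schmidt."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v = v - (Q[:, i] @ A[:, j]) * Q[:, i]   # remove component along earlier columns
        Q[:, j] = v / np.linalg.norm(v)
    return Q

rng = np.random.default_rng(0)
A = rng.uniform(-1.0, 1.0, size=(3, 3))  # i.i.d. Uniform[-1, 1] entries
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: Q is orthogonal
```

This only checks orthogonality; it says nothing about how $Q$ is distributed over the orthogonal group.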
See this paper for further discussion, and a method that produces a uniformly random unitary matrix.
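One standard recipe for a genuinely uniform (Haar-distributed) orthogonal matrix, which I believe is along the lines of what the paper discusses, is to QR-factorize a matrix of i.i.d. standard normals and fix the column signs so the factorization is unique. A minimal sketch, again assuming NumPy:

```python
import numpy as np

def haar_orthogonal(n, rng):
    """Sample an n x n orthogonal matrix from the Haar (uniform) distribution."""
    Z = rng.standard_normal((n, n))      # i.i.d. N(0, 1) entries
    Q, R = np.linalg.qr(Z)
    Q = Q * np.sign(np.diag(R))          # sign-correct columns so diag(R) > 0; then Q is Haar
    return Q

rng = np.random.default_rng(0)
Q = haar_orthogonal(3, rng)
print(np.allclose(Q.T @ Q, np.eye(3)))   # True
```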