I can answer your first question, but the other two seem less reasonable to answer concisely.
It's possible to find such a matrix for any $n\ge 3$, and there are tons of them. Let's just choose the first column to be $w'=(0,1,\dots,1)^T$ and $w = w'/||w'||$. We are thus looking to complete $w$ to an orthonormal basis $w,v_1,\dots,v_{n-1}$ with no coordinate of any $v_i$ equal to $0$. The space of vectors orthogonal to $w$ is the space
$$W = \{(x_1,\dots,x_n)^T : x_2+\dots+x_n = 0\},$$
and we want to find an orthonormal basis of this space with no zero coordinates. That is, we want to find an orthonormal basis of this space which avoids the $n$ hyperplanes
$$A_i = \{(x_1,\dots,x_n)^T : x_i = 0\}.$$
The key use of $n\ge 3$ is that $W \cap A_i$ is $(n-2)$-dimensional, i.e. a proper subspace of $W$, for each $i$. When $n=2$, by contrast, $W \cap A_2 = W$ is $1$-dimensional, so it cannot be avoided.

For each $i$, let $B_i = W \cap A_i$, an $(n-2)$-dimensional subspace of the $(n-1)$-dimensional space $W$, and let $C_i = B_i^\perp \cap W$, a $1$-dimensional subspace of $W$ (and $1\le n-2$, remember). Since all the $B_i$ and $C_i$ have dimension smaller than that of $W$, their union cannot be all of $W$. Pick a unit vector $v_1$ in $W$ but not in any $B_i$ or $C_i$. (You can pick it with norm $1$: if $v$ is any vector in $W$ outside their union, then so is $\lambda v$ for every $\lambda \ne 0$, since if $\lambda v$ lay in one of these subspaces, then so would $v = (1/\lambda)\lambda v$.)

Now let $W_1 = W\cap \{v_1\}^\perp$, which is $(n-2)$-dimensional, and for each $i$ let $B_i^1 = B_i \cap W_1$, which is $(n-3)$-dimensional: we picked $v_1 \notin C_i$, so $B_i \not\subseteq W_1$. We are now looking for $n-2$ orthonormal vectors in $W_1$ lying in none of the $B_i^1$, and we can iterate.
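The avoidance argument can also be seen numerically: the subspaces $B_i$ and $C_i$ have measure zero in $W$, so completing $w$ with random vectors and orthonormalizing avoids them almost surely. A minimal sketch in NumPy (the helper name and the random-choice strategy are mine, not part of the answer above):

```python
import numpy as np

def orthonormal_basis_no_zeros(n, seed=0):
    """Complete w = (0,1,...,1)/||.|| to an orthonormal basis of R^n.

    Random Gaussian vectors followed by Gram-Schmidt avoid the
    measure-zero hyperplanes almost surely, so the remaining columns
    v_1, ..., v_{n-1} have no zero coordinates.
    """
    rng = np.random.default_rng(seed)
    w = np.ones(n)
    w[0] = 0.0                      # w itself is allowed a zero coordinate
    w /= np.linalg.norm(w)
    basis = [w]
    for _ in range(n - 1):
        v = rng.standard_normal(n)
        for b in basis:             # Gram-Schmidt: project out earlier vectors
            v -= (v @ b) * b
        basis.append(v / np.linalg.norm(v))
    return np.column_stack(basis)

Q = orthonormal_basis_no_zeros(4)
assert np.allclose(Q.T @ Q, np.eye(4))  # the columns are orthonormal
assert np.all(Q[:, 1:] != 0)            # v_1,...,v_{n-1} avoid every hyperplane A_i
```

This trades the explicit deterministic construction for an almost-sure one, but it confirms that such matrices exist in abundance, as claimed.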
Let $T^*$ denote the adjoint to $T$ (i.e. the transformation corresponding to the transpose of the matrix of $T$). We find that $\langle Tu,w \rangle = \langle u,T^*w \rangle$ holds for all $u,w \in \Bbb R^a$. Note that in the usual approach, the SVD is proved as a consequence of the spectral theorem and the fact that $(T^*T)^* = T^*T$ (i.e. $T^*T$ is self-adjoint). However, given your unconventional approach to this problem (i.e. your decision not to simply read a textbook), I assume that you want to avoid this.
With that in mind, I begin with the following claim.
Claim: There exists a unit vector $u$ for which $\|Tu\| = \max_{x \in \Bbb R^a,\|x\| = 1} \|Tx\|$.
This is a consequence of the fact that the unit sphere $\{x: \|x\| = 1\}$ is compact and the function $f(x) = \|Tx\|$ is continuous. I now claim that, as a consequence, $Tw \perp Tu$ for any $w \perp u$. In other words, $u$ satisfies the condition of the pseudo-orthogonal lemma.
Indeed, suppose for the purpose of contradiction that $w$ is a unit vector with $w \perp u$, but $\langle Tu,Tw\rangle \neq 0$. It follows that
\begin{align}
\| T(\cos \theta u + \sin \theta w)\|^2 &= \langle T(\cos \theta u + \sin \theta w), T(\cos \theta u + \sin \theta w)\rangle\\
&= \|Tu\|^2\cos^2\theta + \|Tw\|^2\sin^2 \theta + 2 \langle Tu, Tw\rangle \sin \theta \cos \theta
\\ & = \|Tw\|^2 + (\|Tu\|^2 - \|Tw\|^2)\cos^2 \theta + 2 \langle Tu, Tw\rangle \sin \theta \cos \theta
\\ & = a + b\cos^2 \theta + c \sin \theta \cos \theta
\\ & = a_0 + b_0 \cos(2\theta) + c_0 \sin(2 \theta),
\end{align}
where $c_0 = c/2 = \langle Tu, Tw\rangle \neq 0$, using the identities $\cos^2\theta = \tfrac12(1+\cos 2\theta)$ and $\sin\theta\cos\theta = \tfrac12\sin 2\theta$. By the maximality of $\|Tu\|$, the function
$$
f(\theta) = a_0 + b_0 \cos(2\theta) + c_0 \sin(2 \theta)
$$
attains a maximum at $\theta = 0$. However, we compute
$$
f'(\theta) = -2b_0\sin(2 \theta) + 2c_0 \cos(2\theta) \implies f'(0) = 2c_0 \neq 0,
$$
so $f$ does not attain a maximum at $\theta = 0$, a contradiction.
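The claim above can be sanity-checked numerically: the maximizer $u$ is (up to sign) the top right singular vector of $T$, and the image of any $w \perp u$ is indeed orthogonal to $Tu$. A quick NumPy check (random $T$; the variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
a = 4
T = rng.standard_normal((a, a))

# The maximizer of ||Tx|| over unit vectors is the top right singular vector.
U, s, Vt = np.linalg.svd(T)
u = Vt[0]
assert np.isclose(np.linalg.norm(T @ u), s[0])  # ||Tu|| attains the maximum s[0]

# Take any unit vector w perpendicular to u ...
w = rng.standard_normal(a)
w -= (w @ u) * u                 # project out the u-component
w /= np.linalg.norm(w)

# ... and check the conclusion of the argument: Tw is perpendicular to Tu.
assert abs((T @ u) @ (T @ w)) < 1e-10
```

This is consistent with the usual SVD picture: $w$ is a combination of the remaining right singular vectors, so $Tw$ is a combination of the remaining left singular vectors, all orthogonal to $Tu$.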
An interesting observation, but unfortunately it doesn't pan out! It already fails in dimension $3$. Wikipedia has the following counterexample, a rotoinversion: $$ \begin{bmatrix} 0 & -0.8 & -0.6 \\ 0.8 & -0.36 & 0.48 \\ 0.6 & 0.48 & -0.64 \end{bmatrix} $$
(WolframAlpha agrees that this is indeed orthogonal; the example is from here)
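The counterexample is also easy to verify directly with NumPy (the matrix below is copied from the answer):

```python
import numpy as np

# The rotoinversion from the answer: orthogonal, but with determinant -1,
# so it is an improper rotation (rotation composed with an inversion).
Q = np.array([[0.0, -0.80, -0.60],
              [0.8, -0.36,  0.48],
              [0.6,  0.48, -0.64]])

assert np.allclose(Q.T @ Q, np.eye(3))   # columns are orthonormal
assert np.isclose(np.linalg.det(Q), -1)  # det = -1: not a pure rotation
```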