First, the difference in the eigenvectors. Let $(\lambda,v)$ be an eigenpair of $A$, i.e., $A v = \lambda v$ and let $\alpha \in \mathbb{C} \setminus \{0\}$. Then
$$A (\alpha v) = \alpha A v = \alpha \lambda v = \lambda (\alpha v).$$
So, $v$ is an eigenvector of $A$ if and only if $\alpha v$ is an eigenvector of $A$. Both are equally "good", unless you desire some additional properties. Note that this works for any $A$, not just $A = C$.
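To see this numerically, here is a small sketch (the $2\times 2$ matrix and the scalar $\alpha$ are arbitrary choices for illustration):

```python
import numpy as np

# Hypothetical symmetric 2x2 matrix, just for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, V = np.linalg.eig(A)
v = V[:, 0]          # an eigenvector for the eigenvalue lam[0]
alpha = -3.5         # any nonzero scalar

# A (alpha v) = lambda (alpha v): the scaled vector is still an eigenvector.
assert np.allclose(A @ (alpha * v), lam[0] * (alpha * v))
```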
Second, the significance of the left singular vectors is that they are the eigenvectors in the eigenvalue decomposition of $XX^T$ (in your notation: $X^T = X'$).
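A quick numerical check of this relationship (the $4\times 3$ matrix below is a random example, not anything from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))   # hypothetical data matrix

U, s, Vt = np.linalg.svd(X)       # full SVD: U is 4x4, s has 3 entries

# The columns of U are eigenvectors of X X^T, with eigenvalues equal to
# the squared singular values, padded with zeros up to dimension 4.
eigvals = np.zeros(4)
eigvals[:3] = s**2
assert np.allclose(X @ X.T @ U, U * eigvals)   # (U * eigvals) scales column j by eigvals[j]
```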
Third, a real diagonal matrix is orthogonal if and only if each of its diagonal elements is either $1$ or $-1$. Let us prove this.
Let $D = \mathop{\rm diag}(d_1,\dots,d_n)$. Obviously, $D = D^T$, so
$$D^TD = \mathop{\rm diag}(d_1^2,\dots,d_n^2).$$
So, $D^TD = {\rm I}$ if and only if $d_k^2 = 1$ for all $k$.
For complex matrices (using the conjugate transpose $Q^*$ in place of $Q^T$), the same argument gives $|d_k| = 1$ for all $k$, i.e., each diagonal entry lies on the unit circle.
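Both cases can be checked directly; the particular diagonal entries below are arbitrary examples:

```python
import numpy as np

# Real case: a diagonal matrix with entries +-1 is orthogonal.
D = np.diag([1.0, -1.0, 1.0])
assert np.allclose(D.T @ D, np.eye(3))

# Complex case: unit-modulus diagonal entries give a unitary matrix.
Dc = np.diag(np.exp(1j * np.array([0.3, 1.2, -2.0])))
assert np.allclose(Dc.conj().T @ Dc, np.eye(3))

# A diagonal entry with |d| != 1 breaks orthogonality.
D2 = np.diag([2.0, 1.0])
assert not np.allclose(D2.T @ D2, np.eye(2))
```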
Let $\lambda_1 \geq \dots \geq \lambda_m \geq 0$ be the eigenvalues of the positive semidefinite matrix $A^TA$, so that the singular values of $A$ are $\sigma_i = \sqrt{\lambda_i}$. Consider
$$ B^T B = \left[ \begin{matrix} A^T & I_m^T \end{matrix} \right] \left[ \begin{matrix} A \\ I_m \end{matrix} \right] = A^TA + I_m. $$
Thus, the eigenvalues of $A^TA + I_m$ are $\lambda_i + 1$ and so the singular values of $B$ are $\sqrt{\lambda_i + 1} = \sqrt{\sigma_i^2 + 1}$.
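This identity is easy to verify numerically; the $5\times 3$ matrix $A$ below is a random stand-in (so $m = 3$):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))   # hypothetical A, so A^T A is 3x3
B = np.vstack([A, np.eye(3)])     # B = [A; I_m], stacking I_3 below A

sA = np.linalg.svd(A, compute_uv=False)
sB = np.linalg.svd(B, compute_uv=False)

# Singular values of B are sqrt(sigma_i(A)^2 + 1).
assert np.allclose(np.sort(sB), np.sort(np.sqrt(sA**2 + 1)))
```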
Best Answer
It is possible to make the matrices smaller, yes. But then $U\Sigma V^T$ wouldn't be the SVD of $A$.
If $A$ is singular, then some of the diagonal entries of $\Sigma$ are zero and could, in principle, be cut out, along with the corresponding columns of $U$ and $V$. But the result is the compact SVD, not the full SVD. Moreover, $A$ is assumed to be invertible here, and a simple rank argument then shows that $U$, $\Sigma$ and $V$ must all be at least $n \times n$.
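The distinction can be illustrated with a rank-deficient example (the $6\times 4$ rank-$2$ matrix below is constructed for demonstration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))  # rank 2, 6x4

# Full SVD: U is 6x6, Vt is 4x4, and s contains (numerically) zero entries.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Cutting out the zero singular values gives the compact SVD.
r = int(np.sum(s > 1e-10))        # numerical rank, here 2
Ur, sr, Vtr = U[:, :r], s[:r], Vt[:r, :]

# The product still reconstructs A, with much smaller factors.
assert np.allclose(Ur @ np.diag(sr) @ Vtr, A)
```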