You can factor a (not necessarily square) matrix as orthogonal times diagonal times orthogonal, and the diagonal entries need not all be non-negative. But multiplying a row or a column of an orthogonal matrix by $-1$ still gives an orthogonal matrix, and you can do that and change a minus to a plus in the diagonal matrix. In that way, the two orthogonal matrices can be chosen so that the diagonal entries in that matrix are all non-negative. Those are what are taken to be the singular values.
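The sign-flipping trick can be checked numerically. A minimal sketch with numpy (the specific matrix is just a random example): start from a factorization whose diagonal has a negative entry, then flip the sign of that diagonal entry together with the matching row of the right orthogonal factor; the product is unchanged and orthogonality is preserved.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
U, s, Vt = np.linalg.svd(A)

# fabricate a factorization with one negative diagonal entry
D = np.diag(s)
D[1, 1] *= -1
W = Vt.copy()
W[1, :] *= -1
assert np.allclose(U @ D @ W, A)         # still a valid factorization
assert np.allclose(W @ W.T, np.eye(3))   # W is still orthogonal

# repair: flip both signs back, restoring a non-negative diagonal
D[1, 1] *= -1
W[1, :] *= -1
assert np.allclose(U @ D @ W, A)
assert (np.diag(D) >= 0).all()
```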
It's a convention to define it that way. But I suspect there are theorems that say that's the only way to define it that makes it have specified nice properties, and those theorems would not be mere conventions.
First, the difference in the eigenvectors. Let $(\lambda,v)$ be an eigenpair of $A$, i.e., $A v = \lambda v$ and let $\alpha \in \mathbb{C} \setminus \{0\}$. Then
$$A (\alpha v) = \alpha A v = \alpha \lambda v = \lambda (\alpha v).$$
So, $v$ is an eigenvector of $A$ if and only if $\alpha v$ is an eigenvector of $A$. Both are equally "good", unless you desire some additional properties. Note that this works for any $A$, not just $A = C$.
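A quick numerical check of this scaling invariance (the matrix and the scalar $\alpha$ are arbitrary choices for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, V = np.linalg.eig(A)
lam, v = w[0], V[:, 0]
alpha = -3.5  # any nonzero scalar works

# alpha * v satisfies the same eigenvalue equation as v
assert np.allclose(A @ (alpha * v), lam * (alpha * v))
```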
Second, the difference in significance: the left singular vectors of $X$ are the eigenvectors of $XX^T$ (in your notation: $X^T = X'$), so they give the eigenvalue decomposition of $XX^T$.
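This relationship is easy to verify numerically; a sketch with a random rectangular $X$ (the shape is arbitrary): the columns of $U$ from the SVD of $X$ diagonalize $XX^T$, with the squared singular values on the diagonal.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 6))
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# columns of U are eigenvectors of X X^T with eigenvalues s**2
assert np.allclose(U.T @ (X @ X.T) @ U, np.diag(s**2))
```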
Third, a real diagonal matrix is orthogonal if and only if each of its diagonal elements is either $1$ or $-1$. Let us prove this.
Let $D = \mathop{\rm diag}(d_1,\dots,d_n)$. Obviously, $D = D^T$, so
$$D^TD = \mathop{\rm diag}(d_1^2,\dots,d_n^2).$$
So, $D^TD = {\rm I}$ if and only if $d_k^2 = 1$ for all $k$.
For complex matrices (using the conjugate transpose $Q^*$ instead of the transpose $Q^T$), the same argument gives $|d_k| = 1$ for all $k$, i.e., a complex diagonal matrix is unitary if and only if each diagonal entry has modulus $1$.
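Both the real and the complex cases can be checked directly (the particular diagonal entries below are arbitrary examples):

```python
import numpy as np

# real case: diagonal entries in {+1, -1} give an orthogonal matrix
D = np.diag([1.0, -1.0, 1.0])
assert np.allclose(D.T @ D, np.eye(3))

# complex case: unit-modulus entries give a unitary diagonal matrix
Dc = np.diag(np.exp(1j * np.array([0.3, -1.2, 2.0])))
assert np.allclose(Dc.conj().T @ Dc, np.eye(3))
```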
Let $A=UDU^*$ be the orthogonal diagonalization, where $$ D = \mathrm{diag}(s_1,\dots,s_k,s_{k+1},\dots,s_n) $$ with $s_1,\dots,s_k\geq 0$ and $s_{k+1},\dots,s_n<0$.
Let $V$ be the matrix with the same first $k$ columns as $U$, and whose last $n-k$ columns are the negatives of those of $U$: $$ V=(u_1,\dots,u_k,-u_{k+1},\dots,-u_n), $$ where $U=(u_1,\dots,u_n)$. Moreover, let $$ \Sigma = \mathrm{diag}(s_1,\dots,s_k,-s_{k+1},\dots,-s_n). $$ Then $V$ is also orthogonal and $A=U\Sigma V^*$ is an SVD of $A$ (up to reordering the singular values in decreasing order).
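The construction above can be sketched in a few lines of numpy (a random symmetric matrix serves as the example; `eigh` plays the role of the diagonalization $A = UDU^*$):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))
A = B + B.T                       # real symmetric, may have negative eigenvalues

w, U = np.linalg.eigh(A)          # A = U @ diag(w) @ U.T
signs = np.where(w < 0, -1.0, 1.0)
V = U * signs                     # negate the columns of U where w < 0
Sigma = np.abs(w)                 # non-negative singular values

assert np.allclose(U @ np.diag(Sigma) @ V.T, A)  # A = U Sigma V^T
assert np.allclose(V.T @ V, np.eye(3))           # V is orthogonal
```

As in the text, a proper SVD would additionally sort `Sigma` in decreasing order and permute the columns of `U` and `V` accordingly.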