First, the difference in the eigenvectors. Let $(\lambda,v)$ be an eigenpair of $A$, i.e., $A v = \lambda v$ and let $\alpha \in \mathbb{C} \setminus \{0\}$. Then
$$A (\alpha v) = \alpha A v = \alpha \lambda v = \lambda (\alpha v).$$
So, $v$ is an eigenvector of $A$ if and only if $\alpha v$ is an eigenvector of $A$. Both are equally "good", unless you desire some additional properties. Note that this works for any $A$, not just $A = C$.
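A quick numerical check of this scaling argument (a sketch using NumPy and an arbitrary symmetric matrix of my choosing):

```python
import numpy as np

# A symmetric matrix with easily computed eigenpairs.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, V = np.linalg.eigh(A)
v = V[:, 0]          # eigenvector for eigenvalue lam[0]
alpha = -3.7         # any nonzero scalar

# alpha*v is still an eigenvector for the same eigenvalue lam[0].
print(np.allclose(A @ (alpha * v), lam[0] * (alpha * v)))  # True
```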
Second, the significance of the left singular vectors is that they give the eigenvalue decomposition of $XX^T$ (in your notation: $X^T = X'$).
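To make this concrete: if $X = U\Sigma V^T$, then $XX^T = U\Sigma\Sigma^T U^T$, so each left singular vector $u_i$ is an eigenvector of $XX^T$ with eigenvalue $\sigma_i^2$. A sketch, using a random matrix as a stand-in for your data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 6))   # hypothetical data matrix

U, s, Vt = np.linalg.svd(X)
G = X @ X.T                       # the matrix X X^T

# Each left singular vector u_i satisfies (X X^T) u_i = sigma_i^2 u_i.
for i in range(len(s)):
    print(np.allclose(G @ U[:, i], s[i] ** 2 * U[:, i]))  # True for each i
```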
Third, a real diagonal matrix is orthogonal if and only if each of its diagonal elements is either $1$ or $-1$. Let us prove this.
Let $D = \mathop{\rm diag}(d_1,\dots,d_n)$. Obviously, $D = D^T$, so
$$D^TD = \mathop{\rm diag}(d_1^2,\dots,d_n^2).$$
So, $D^TD = {\rm I}$ if and only if $d_k^2 = 1$ for all $k$.
For complex matrices (using the conjugate transpose $Q^*$ instead of the transpose $Q^T$), the same argument gives $|d_k| = 1$ for all $k$.
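The real case can be checked directly; a minimal sketch with one diagonal matrix that satisfies the condition and one that does not:

```python
import numpy as np

# Diagonal entries all +-1: D^T D = I, so D is orthogonal.
D = np.diag([1.0, -1.0, 1.0])
print(np.allclose(D.T @ D, np.eye(3)))   # True

# A diagonal entry with d_k^2 != 1 breaks orthogonality.
E = np.diag([1.0, 2.0, 1.0])
print(np.allclose(E.T @ E, np.eye(3)))   # False
```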
$M^+=V\Sigma^+U^\ast$ (i.e. $M^+=V\Sigma^+U^\top$ in the real case), where $\Sigma^+$ is a rectangular diagonal matrix whose size is identical to the size of $\Sigma^\top$. The $i$-th main diagonal entry of $\Sigma^+$ is $\sigma_i^{-1}$ if the $i$-th singular value $\sigma_i$ of $M$ is positive, otherwise the diagonal entry is zero.
You may prove that $M^+$ is indeed the Moore-Penrose pseudoinverse of $M$ by verifying that it satisfies the four defining properties of the Moore-Penrose pseudoinverse: both $MM^+$ and $M^+M$ are Hermitian (or real symmetric in your case), $MM^+M=M$, and $M^+MM^+=M^+$.
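This construction and the four properties can be verified numerically; a sketch, building $\Sigma^+$ by hand for a rank-deficient matrix of my choosing (the tolerance for "positive" singular values is an assumption, not part of the original argument):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 4))  # rank <= 3

U, s, Vt = np.linalg.svd(M)

# Sigma^+ has the shape of Sigma^T; invert only the positive singular values.
tol = max(M.shape) * np.finfo(float).eps * s.max()   # assumed cutoff
Sigma_plus = np.zeros((M.shape[1], M.shape[0]))
for i, sv in enumerate(s):
    if sv > tol:
        Sigma_plus[i, i] = 1.0 / sv

M_plus = Vt.T @ Sigma_plus @ U.T     # M^+ = V Sigma^+ U^T (real case)

# The four Moore-Penrose conditions:
print(np.allclose(M @ M_plus @ M, M))             # M M^+ M = M
print(np.allclose(M_plus @ M @ M_plus, M_plus))   # M^+ M M^+ = M^+
print(np.allclose((M @ M_plus).T, M @ M_plus))    # M M^+ symmetric
print(np.allclose((M_plus @ M).T, M_plus @ M))    # M^+ M symmetric
# Agrees with NumPy's built-in pseudoinverse:
print(np.allclose(M_plus, np.linalg.pinv(M)))
```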
By writing $A^{-1}$ you are assuming the inverse of $A$ exists. Thus, $A$ must be square and $U,\Sigma,V$ must all be square invertible matrices. Hence $A^{-1} = (U\Sigma V^\top)^{-1} = (V^\top)^{-1}\Sigma^{-1}U^{-1}$. Note also that, in singular value decomposition, both $U$ and $V$ are orthogonal matrices and hence $U^{-1}=U^\top$ and $V^{-1}=V^{\top}$. We then get $A^{-1}=V\Sigma^{-1}U^\top$.
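The derivation can be confirmed numerically; a sketch with a random square matrix (almost surely invertible), checking $A^{-1} = V\Sigma^{-1}U^\top$ against the direct inverse:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))   # a generic square matrix

U, s, Vt = np.linalg.svd(A)
A_inv = Vt.T @ np.diag(1.0 / s) @ U.T   # V Sigma^{-1} U^T

print(np.allclose(A_inv, np.linalg.inv(A)))  # True
print(np.allclose(A @ A_inv, np.eye(4)))     # True
```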