First, the difference in the eigenvectors. Let $(\lambda,v)$ be an eigenpair of $A$, i.e., $A v = \lambda v$ and let $\alpha \in \mathbb{C} \setminus \{0\}$. Then
$$A (\alpha v) = \alpha A v = \alpha \lambda v = \lambda (\alpha v).$$
So, $v$ is an eigenvector of $A$ if and only if $\alpha v$ is an eigenvector of $A$. Both are equally "good", unless you desire some additional properties. Note that this works for any $A$, not just $A = C$.
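A quick numerical sanity check of this scaling property, using an arbitrary example matrix (any square matrix and any nonzero scalar would do):

```python
import numpy as np

# Arbitrary example matrix; the argument works for any square A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)
lam, v = eigvals[0], eigvecs[:, 0]

alpha = 3.0 - 4.0j          # any nonzero complex scalar
w = alpha * v

# A(alpha v) = lambda (alpha v): the scaled vector is still an eigenvector.
print(np.allclose(A @ w, lam * w))  # True
```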
Second, the significance of the left singular vectors is that they are the eigenvectors in the eigenvalue decomposition of $XX^T$ (in your notation: $X^T = X'$), with eigenvalues equal to the squared singular values.
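A sketch of this relationship, using a randomly generated example matrix: the columns of $U$ from the SVD of $X$ are eigenvectors of $XX^T$ with eigenvalues $\sigma_i^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 6))   # arbitrary example matrix

U, s, Vt = np.linalg.svd(X)

# (X X^T) U = U diag(s^2): columns of U are eigenvectors of X X^T.
print(np.allclose(X @ X.T @ U, U * s**2))  # True
```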
Third, a real diagonal matrix is orthogonal if and only if each of its diagonal elements is either $1$ or $-1$. Let us prove this.
Let $D = \mathop{\rm diag}(d_1,\dots,d_n)$. Obviously, $D = D^T$, so
$$D^TD = \mathop{\rm diag}(d_1^2,\dots,d_n^2).$$
So, $D^TD = {\rm I}$ if and only if $d_k^2 = 1$ for all $k$.
For complex matrices (using the conjugate transpose $Q^*$ instead of the transpose $Q^T$), the same argument shows that a diagonal matrix is unitary if and only if $|d_k| = 1$ for all $k$.
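A small numerical illustration of both directions, with made-up diagonal entries:

```python
import numpy as np

D = np.diag([1.0, -1.0, 1.0])            # all entries are +1 or -1
print(np.allclose(D.T @ D, np.eye(3)))   # True: D is orthogonal

E = np.diag([1.0, 2.0, -1.0])            # the entry 2 breaks orthogonality
print(np.allclose(E.T @ E, np.eye(3)))   # False

# Complex case: unimodular diagonal entries give a unitary matrix.
F = np.diag(np.exp(1j * np.array([0.3, 1.7, -2.0])))
print(np.allclose(F.conj().T @ F, np.eye(3)))  # True
```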
Given an SVD $M = U\Sigma V^\ast$, the pseudoinverse is $M^+=V\Sigma^+U^\ast$ (i.e. $M^+=V\Sigma^+U^\top$ in the real case), where $\Sigma^+$ is a rectangular diagonal matrix whose size is identical to the size of $\Sigma^\top$. The $i$-th main diagonal entry of $\Sigma^+$ is $\sigma_i^{-1}$ if the $i$-th singular value $\sigma_i$ of $M$ is positive, otherwise the diagonal entry is zero.
You may simply prove that $M^+$ is indeed the Moore-Penrose pseudoinverse of $M$ by showing that it satisfies the four defining properties of Moore-Penrose pseudoinverse, namely, both $MM^+$ and $M^+M$ are Hermitian (or real symmetric in your case), $MM^+M=M$ and $M^+MM^+=M^+$.
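The four defining properties can be checked numerically. A minimal sketch, assuming a random real example matrix and the full SVD from NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 3))   # arbitrary real example

U, s, Vt = np.linalg.svd(M)       # full SVD: U is 5x5, Vt is 3x3

# Sigma^+ has the shape of Sigma^T; invert only the positive singular values.
Sigma_plus = np.zeros((3, 5))
for i, sigma in enumerate(s):
    if sigma > 1e-12:
        Sigma_plus[i, i] = 1.0 / sigma

M_plus = Vt.T @ Sigma_plus @ U.T

# The four Moore-Penrose properties:
print(np.allclose(M @ M_plus @ M, M))             # M M+ M = M
print(np.allclose(M_plus @ M @ M_plus, M_plus))   # M+ M M+ = M+
print(np.allclose((M @ M_plus).T, M @ M_plus))    # M M+ is symmetric
print(np.allclose((M_plus @ M).T, M_plus @ M))    # M+ M is symmetric
```

As a cross-check, `M_plus` agrees with `np.linalg.pinv(M)`.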
Best Answer
In my view, a singular value decomposition just requires the diagonal entries of $\Sigma$ to be nonnegative, so $A^{-1} = V \Sigma^{-1} U^\top$ does indeed constitute a singular value decomposition. This is also the view Wikipedia takes, though some authors may adopt a different definition. If this is something you might see on a true/false question on an exam, I would certainly consult the reference being used in your particular course to see what their official definition of an SVD is.
The ordering of the singular values in descending order is a useful convention, which is worth enforcing if one does not have a good reason not to. Wanting to order the singular values of the inverse of a matrix in ascending order, so that the SVDs of a matrix and its inverse share the same factors $U$ and $V$, is certainly a good reason to loosen this convention. In any kind of mathematical exposition, it is helpful to make explicit note of such convention issues to avoid possible confusion for the reader.
Certainly, by reversing the order of the columns of $U$ and $V$ and of the diagonal entries of $\Sigma$ to obtain matrices $\tilde{U}$, $\tilde{V}$, and $\tilde{\Sigma}$, one easily converts the (reverse-ordered) SVD $A^{-1} = V \Sigma^{-1} U^\top$ into an equivalent SVD $A^{-1} = \tilde{V} \tilde{\Sigma}^{-1} \tilde{U}^\top$ with singular values in descending order.
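This column-reversal step can be sketched numerically. The example matrix below is made up (a random matrix shifted to be safely invertible); the reversal permutation `P` reorders columns and diagonal entries simultaneously:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)  # example, safely invertible

U, s, Vt = np.linalg.svd(A)
V = Vt.T

# A^{-1} = V diag(1/s) U^T has singular values 1/s in ascending order.
# Reversing columns of U, V and the entries of s restores the
# descending convention: A^{-1} = V~ diag(1/s~) U~^T.
P = np.eye(3)[:, ::-1]          # permutation that reverses order
U_t, V_t = U @ P, V @ P
s_t = s[::-1]

A_inv = V_t @ np.diag(1.0 / s_t) @ U_t.T
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
print(np.all(np.diff(1.0 / s_t) <= 0))       # singular values now descending
```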