The singular values of a matrix $A$ (not necessarily square) are the square roots of the eigenvalues of $A^TA$. Squaring the entries of $S$ will NOT give the eigenvalues of $A$, but those of $A^TA$. Likewise, the right singular vectors (the columns of $V$) are eigenvectors of $A^TA$, and the left singular vectors (the columns of $U$) are eigenvectors of $AA^T$. A priori, there is no direct connection between the eigenvalues of $A$ and the singular values of $A$, or between eigenvectors and singular vectors.
Of course, there are some relations. A diagonal matrix scales the length of a vector by at most the largest absolute value of its entries. Because orthogonal matrices preserve length, the largest singular value is the largest factor by which $A$ can scale the length of a vector. If $A$ is symmetric, the same description holds of the eigenvalue of largest absolute value. In general, however, if $\lambda$ is the eigenvalue of $A$ of largest absolute value, then $|\lambda|$ will be smaller than the largest singular value. For example, with the matrix
$$ \pmatrix{1 & 2\\ 3 & 4}$$
the eigenvalues are $5.372$ and $-0.372$, while the singular values are $5.465$ and $0.366$.
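A quick numerical check of these values, as a sketch using NumPy:

```python
import numpy as np

# The example matrix from above.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

eigvals = np.linalg.eigvals(A)                  # eigenvalues of A
singvals = np.linalg.svd(A, compute_uv=False)   # singular values of A

print(np.sort(eigvals.real))   # approximately [-0.372, 5.372]
print(np.sort(singvals))       # approximately [ 0.366, 5.465]
```

Note that the singular values straddle the absolute values of the eigenvalues, as described above.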
If $A$ is symmetric positive semidefinite, the eigendecomposition $A = Q\Lambda Q^T$ is already an SVD; for a general symmetric matrix, taking $S = |\Lambda|$ and absorbing the signs of the negative eigenvalues into one of the orthogonal factors gives an SVD. Unfortunately, I know of no good way to go from one decomposition to the other in general.
There is a nice relationship for normal matrices, that is, matrices which satisfy $AA^*=A^*A$ (or in the case of real matrices, $AA^T=A^TA$). A matrix is normal if and only if it can be diagonalized by a unitary matrix, that is, $A=USU^*$ where $UU^*=I$ and $S$ is diagonal. In this case, $AA^*=(USU^*)(US^*U^*)=U(SS^*)U^*$. Here $SS^*$ is diagonal with entries $|\lambda_i|^2$, the squares of the absolute values of the eigenvalues of $A$, and therefore the singular values of $A$ are in fact the absolute values of the eigenvalues of $A$.
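To illustrate, here is a sketch using a real skew-symmetric matrix (my own choice of example; any normal matrix would do):

```python
import numpy as np

# A real skew-symmetric matrix is normal: A A^T = A^T A.
A = np.array([[0.0, -2.0],
              [2.0,  0.0]])

assert np.allclose(A @ A.T, A.T @ A)   # normality

eigvals = np.linalg.eigvals(A)                  # +2i and -2i
singvals = np.linalg.svd(A, compute_uv=False)   # both equal 2

# For a normal matrix, the singular values are the
# absolute values of the eigenvalues.
assert np.allclose(np.sort(np.abs(eigvals)), np.sort(singvals))
```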
Unfortunately, this makes it easy to go from an eigenvalue decomposition of $A$ to an SVD, but not the other way around.
Let us call $$A- 2I = B = \left(\begin{array}{ccc}-2 & -3 & -2 \\1 & 1 & 1 \\1 & 2 & 1\end{array}\right).$$ Row reducing, we find that $B \to \pmatrix{1&0&1\\0&1&0\\0&0&0},$ so the null space of $B$ has dimension one and $$u = \pmatrix{1\\0\\-1}, \qquad Bu= 0,$$ is a basis for it.
You can also find that the null space of $B^2$ has dimension $2.$ Let $v$ solve $Bv = u.$ We find $v$ by row reducing the augmented matrix $[B \mid u]:$ $$[B \mid u] \to \pmatrix{1&0&1&1\\0&1&0&-1\\0&0&0&0},$$ so $$v = \pmatrix{1\\-1\\0}, \qquad Bv = u.$$
Next, we solve $Bw = v$ and find $$[B \mid v] \to \pmatrix{1&0&1&-2\\0&1&0&1\\0&0&0&0},$$ so $$w = \pmatrix{-2\\1\\0}, \qquad Bw = v.$$
You can verify that $\{u, v, w\}$ is a basis, and with respect to this basis $B$ and $A = B+2I$ have the representations $$ \pmatrix{0&1&0\\0&0&1\\0&0&0}, \qquad \pmatrix{2&1&0\\0&2&1\\0&0&2}.$$
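The chain relations and the change of basis above can be verified numerically; a minimal sketch with NumPy:

```python
import numpy as np

# B = A - 2I from the worked example, so A = B + 2I.
B = np.array([[-2.0, -3.0, -2.0],
              [ 1.0,  1.0,  1.0],
              [ 1.0,  2.0,  1.0]])
A = B + 2.0 * np.eye(3)

u = np.array([ 1.0,  0.0, -1.0])
v = np.array([ 1.0, -1.0,  0.0])
w = np.array([-2.0,  1.0,  0.0])

# The Jordan chain: B u = 0, B v = u, B w = v.
assert np.allclose(B @ u, 0)
assert np.allclose(B @ v, u)
assert np.allclose(B @ w, v)

# In the basis {u, v, w}, A becomes a single Jordan block
# with eigenvalue 2.
P = np.column_stack([u, v, w])
J = np.linalg.inv(P) @ A @ P
print(np.round(J, 10))   # [[2, 1, 0], [0, 2, 1], [0, 0, 2]]
```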
Your matrix consists of two $2\times 2$ diagonal blocks, so you can compute the exponential of each block separately. The blocks themselves are of the form $I+N$ and $-I+M$ where $N$ and $M$ are nilpotent ($N^2=0$, $M^2=0$). So: $$ e^{N+I} =e^Ne^I = (I+N)e^I, \qquad e^{M-I} = e^M e^{-I}=(I+M)e^{-I}. $$
The matrix exponential can be computed blockwise because the exponential is a sum of powers, and both sums and products of block-diagonal matrices can be computed blockwise. The exponential of a matrix $N$ with $N^2=0$ is $I+N$, since all higher powers $N^2, N^3, \dots$ in the series $e^N = I + N + N^2/2! + N^3/3! + \cdots$ vanish. Clearly $I$ commutes with every matrix, hence $\exp(N+I) = e^Ne^I$; the same is true for $-I$, which is a multiple of $I$.
Specifically: $$ \exp\begin{pmatrix}1&2\\0&1\end{pmatrix} = \begin{pmatrix}1&2\\0&1\end{pmatrix}\begin{pmatrix}e&0\\0&e\end{pmatrix} =\begin{pmatrix}e&2e\\0&e\end{pmatrix} $$ while $$ \exp\begin{pmatrix}-1&0\\1&-1\end{pmatrix} = \begin{pmatrix}1&0\\1&1\end{pmatrix}\begin{pmatrix}1/e&0\\0&1/e\end{pmatrix} =\begin{pmatrix}1/e & 0\\1/e & 1/e\end{pmatrix} $$ Hence $$ e^A = \begin{pmatrix}e&2e&0&0\\ 0&e&0&0\\ 0&0&1/e&0\\ 0&0&1/e&1/e\end{pmatrix} $$
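The blockwise result agrees with a direct numerical evaluation; a sketch assuming SciPy is available:

```python
import numpy as np
from scipy.linalg import expm  # dense matrix exponential (Padé-based)

e = np.e

# The block-diagonal matrix whose exponential is computed above.
A = np.array([[1.0, 2.0,  0.0,  0.0],
              [0.0, 1.0,  0.0,  0.0],
              [0.0, 0.0, -1.0,  0.0],
              [0.0, 0.0,  1.0, -1.0]])

# Expected result from the blockwise computation.
expected = np.array([[e, 2*e, 0.0,   0.0],
                     [0.0, e, 0.0,   0.0],
                     [0.0, 0.0, 1/e, 0.0],
                     [0.0, 0.0, 1/e, 1/e]])

assert np.allclose(expm(A), expected)
```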