The notation I'm used to using with singular value decomposition is $$A=U\Sigma V^T,$$ so pardon the different notation I'll use here. ($\Sigma$ is the matrix of singular values, like what I assume $W$ is in your question; everything else is the same, aside from the transpose, which I denote by $\ ^T$.)
We start any singular value decomposition by calculating $A^T A$, which in this case is equal to $A^2$, because $A$ is symmetric.
$$A^T A=A^2=\begin{bmatrix}0&1&3&3\\1&0&3&3\\3&3&0&1\\3&3&1&0\end{bmatrix}^2=\begin{bmatrix}19&18&6&6\\18&19&6&6\\6&6&19&18\\6&6&18&19\end{bmatrix}$$
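As a quick sanity check (not part of the original derivation), this product can be verified in a few lines of pure Python:

```python
# The symmetric matrix A from the question, as a list of rows.
A = [[0, 1, 3, 3],
     [1, 0, 3, 3],
     [3, 3, 0, 1],
     [3, 3, 1, 0]]

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(col) for col in zip(*X)]

# Since A is symmetric, A^T A = A^2.
AtA = matmul(transpose(A), A)
print(AtA)
```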
We then find the eigenvalues and eigenvectors using the characteristic polynomial of $A^T A$, as follows:
$$\det(A^T A-\lambda I_4)=\begin{vmatrix}19-\lambda&18&6&6\\18&19-\lambda&6&6\\6&6&19-\lambda&18\\6&6&18&19-\lambda\end{vmatrix}=$$
(skipping some tedious algebra that, I'll be honest, I used a website for)
$$\begin{align}\lambda^4-76\lambda^3+1374\lambda^2-2524\lambda+1225&=0\\(\lambda-1)(\lambda-1)(\lambda-25)(\lambda-49)&=0\end{align}$$ $$\lambda_1=1,\;\lambda_2=1,\;\lambda_3=25,\;\lambda_4=49$$
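If you'd rather not take the website's word for the factorization either, the claimed eigenvalues can be checked directly against the characteristic polynomial:

```python
# Characteristic polynomial of A^T A, as computed above.
def p(lam):
    return lam**4 - 76 * lam**3 + 1374 * lam**2 - 2524 * lam + 1225

# Every claimed eigenvalue should be a root.
for lam in (1, 1, 25, 49):
    assert p(lam) == 0
print("all four eigenvalues are roots of p")
```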
Now that we have the eigenvalues of $A^T A$, we know $\Sigma$, since it's defined to be the same shape as $A$ and to have entries only on the diagonal, which are given by $\sigma_i=\sqrt{\lambda_i}$: $$\bbox[5px,border:2px solid red]{\Sigma=\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&5&0\\0&0&0&7\end{bmatrix}}$$
We can now begin to calculate $V$, whose columns will be orthonormal eigenvectors of $A^T A$. There are quite a lot of row operations to be performed here, for which I used the same website; you can refer to it to check their validity if you so desire. The first two columns of $V$, $\vec{v_1}$ and $\vec{v_2}$, will span the null space of the matrix $A^T A-\lambda_1 I_4$. To find them, we solve the following system and normalize the resulting basis vectors: $$\left(A^T A-\lambda_1 I_4\right)\vec{v}=\vec{0}$$ $$\begin{bmatrix}18&18&6&6\\18&18&6&6\\6&6&18&18\\6&6&18&18\end{bmatrix}\vec{v}=\begin{bmatrix}0\\0\\0\\0\end{bmatrix}$$
The general solution of this system tells us that $$\text{null}\left(A^T A-\lambda_1 I_4\right)=\text{span}\left\{\begin{bmatrix}0\\0\\-1\\1\end{bmatrix},\begin{bmatrix}-1\\1\\0\\0\end{bmatrix}\right\}.$$
Therefore, the first two columns of $V$ are $$\vec{v_1}=\frac{1}{\sqrt{2}}\begin{bmatrix}0\\0\\-1\\1\end{bmatrix},\;\vec{v_2}=\frac{1}{\sqrt{2}}\begin{bmatrix}-1\\1\\0\\0\end{bmatrix}.$$
Similarly, the normalized basis vector of the null space of $A^T A-\lambda_3 I_4$ will be $\vec{v_3}$, and the normalized basis vector of the null space of $A^T A-\lambda_4 I_4$ will be $\vec{v_4}$. $$\left(A^T A-\lambda_3 I_4\right)\vec{v}=\vec{0}$$
$$\begin{bmatrix}-6&18&6&6\\18&-6&6&6\\6&6&-6&18\\6&6&18&-6\end{bmatrix}\vec{v}=\begin{bmatrix}0\\0\\0\\0\end{bmatrix}$$
$$\implies \vec{v_3}=\frac{1}{2}\begin{bmatrix}-1\\-1\\1\\1\end{bmatrix}$$
$$\left(A^T A-\lambda_4 I_4\right)\vec{v}=\vec{0}$$
$$\begin{bmatrix}-30&18&6&6\\18&-30&6&6\\6&6&-30&18\\6&6&18&-30\end{bmatrix}\vec{v}=\begin{bmatrix}0\\0\\0\\0\end{bmatrix}$$
$$\implies \vec{v_4}=\frac{1}{2}\begin{bmatrix}1\\1\\1\\1\end{bmatrix}$$
Pulling together the columns of $V$, we find that
$$V=\begin{bmatrix}0&-\frac{1}{\sqrt{2}}&-\frac{1}{2}&\frac{1}{2}\\0&\frac{1}{\sqrt{2}}&-\frac{1}{2}&\frac{1}{2}\\-\frac{1}{\sqrt{2}}&0&\frac{1}{2}&\frac{1}{2}\\\frac{1}{\sqrt{2}}&0&\frac{1}{2}&\frac{1}{2}\end{bmatrix}\implies \bbox[5px,border:2px solid red]{V^T=\begin{bmatrix}0&0&-\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\-\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}&0&0\\-\frac{1}{2}&-\frac{1}{2}&\frac{1}{2}&\frac{1}{2}\\\frac{1}{2}&\frac{1}{2}&\frac{1}{2}&\frac{1}{2}\end{bmatrix}}$$
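Since $V$ must be orthogonal, a quick way to catch arithmetic slips at this stage is to confirm $V^T V = I_4$ numerically; a pure-Python sketch:

```python
from math import sqrt

s = 1 / sqrt(2)
# V as assembled above, columns v1..v4.
V = [[ 0, -s, -0.5, 0.5],
     [ 0,  s, -0.5, 0.5],
     [-s,  0,  0.5, 0.5],
     [ s,  0,  0.5, 0.5]]

# (V^T V)_{ij} is the dot product of columns i and j;
# it should be the identity up to floating-point error.
VtV = [[sum(V[k][i] * V[k][j] for k in range(4)) for j in range(4)]
       for i in range(4)]
for i in range(4):
    for j in range(4):
        expected = 1.0 if i == j else 0.0
        assert abs(VtV[i][j] - expected) < 1e-12
print("V is orthogonal")
```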
Finally, now that we know the columns of $V$, we can calculate the columns of $U$, which are given by $\vec{u_i}=\frac{A\vec{v_i}}{\sigma_i}$.
$$\vec{u_1}=\left(\frac{1}{1}\right)\left(\frac{1}{\sqrt{2}}\right)\begin{bmatrix}0&1&3&3\\1&0&3&3\\3&3&0&1\\3&3&1&0\end{bmatrix}\begin{bmatrix}0\\0\\-1\\1\end{bmatrix}=\frac{1}{\sqrt{2}}\begin{bmatrix}0\\0\\1\\-1\end{bmatrix}$$
$$\vec{u_2}=\left(\frac{1}{1}\right)\left(\frac{1}{\sqrt{2}}\right)\begin{bmatrix}0&1&3&3\\1&0&3&3\\3&3&0&1\\3&3&1&0\end{bmatrix}\begin{bmatrix}-1\\1\\0\\0\end{bmatrix}=\frac{1}{\sqrt{2}}\begin{bmatrix}1\\-1\\0\\0\end{bmatrix}$$
$$\vec{u_3}=\left(\frac{1}{5}\right)\left(\frac{1}{2}\right)\begin{bmatrix}0&1&3&3\\1&0&3&3\\3&3&0&1\\3&3&1&0\end{bmatrix}\begin{bmatrix}-1\\-1\\1\\1\end{bmatrix}=\frac{1}{10}\begin{bmatrix}5\\5\\-5\\-5\end{bmatrix}=\frac{1}{2}\begin{bmatrix}1\\1\\-1\\-1\end{bmatrix}$$
$$\vec{u_4}=\left(\frac{1}{7}\right)\left(\frac{1}{2}\right)\begin{bmatrix}0&1&3&3\\1&0&3&3\\3&3&0&1\\3&3&1&0\end{bmatrix}\begin{bmatrix}1\\1\\1\\1\end{bmatrix}=\frac{1}{14}\begin{bmatrix}7\\7\\7\\7\end{bmatrix}=\frac{1}{2}\begin{bmatrix}1\\1\\1\\1\end{bmatrix}$$
$$\implies \bbox[5px,border:2px solid red]{U=\begin{bmatrix}0&\frac{1}{\sqrt{2}}&\frac{1}{2}&\frac{1}{2}\\0&-\frac{1}{\sqrt{2}}&\frac{1}{2}&\frac{1}{2}\\\frac{1}{\sqrt{2}}&0&-\frac{1}{2}&\frac{1}{2}\\-\frac{1}{\sqrt{2}}&0&-\frac{1}{2}&\frac{1}{2}\end{bmatrix}}$$
After all that work, we have the singular value decomposition of $A$:
$$A=U\Sigma V^T=\begin{bmatrix}0&\frac{1}{\sqrt{2}}&\frac{1}{2}&\frac{1}{2}\\0&-\frac{1}{\sqrt{2}}&\frac{1}{2}&\frac{1}{2}\\\frac{1}{\sqrt{2}}&0&-\frac{1}{2}&\frac{1}{2}\\-\frac{1}{\sqrt{2}}&0&-\frac{1}{2}&\frac{1}{2}\end{bmatrix} \begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&5&0\\0&0&0&7\end{bmatrix} \begin{bmatrix}0&0&-\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}\\-\frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}}&0&0\\-\frac{1}{2}&-\frac{1}{2}&\frac{1}{2}&\frac{1}{2}\\\frac{1}{2}&\frac{1}{2}&\frac{1}{2}&\frac{1}{2}\end{bmatrix}$$
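To double-check the final result, the product $U\Sigma V^T$ can be recomputed numerically; here is a pure-Python sketch of the three boxed matrices:

```python
from math import sqrt

s = 1 / sqrt(2)
U = [[ 0,  s,  0.5, 0.5],
     [ 0, -s,  0.5, 0.5],
     [ s,  0, -0.5, 0.5],
     [-s,  0, -0.5, 0.5]]
Sigma = [[1, 0, 0, 0],
         [0, 1, 0, 0],
         [0, 0, 5, 0],
         [0, 0, 0, 7]]
Vt = [[   0,    0,  -s,   s],
      [  -s,    s,   0,   0],
      [-0.5, -0.5, 0.5, 0.5],
      [ 0.5,  0.5, 0.5, 0.5]]
A = [[0, 1, 3, 3],
     [1, 0, 3, 3],
     [3, 3, 0, 1],
     [3, 3, 1, 0]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

# U * Sigma * V^T should reproduce A up to floating-point error.
product = matmul(matmul(U, Sigma), Vt)
assert all(abs(product[i][j] - A[i][j]) < 1e-12
           for i in range(4) for j in range(4))
print("U Sigma V^T reconstructs A")
```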
First, correct a simple error:
$$
\mathbf{A}^{*}\mathbf{A} =
\left[
\begin{array}{rr}
9 & -9 \\
-9 & 9 \\
\end{array}
\right].
$$
This system matrix $\mathbf{A}$ is a rank one matrix (column 2 = $-$column 1) with singular value decomposition
$$
\begin{align}
\mathbf{A}
&=
\mathbf{U}\, \Sigma \, \mathbf{V}^{*} \\
%
\left[
\begin{array}{rr}
1 & -1 \\
-2 & 2 \\
2 & -2 \\
\end{array}
\right]
&=
% U
\left[
\begin{array}{ccc}
\frac{1}{3}
\color{blue}{\left[
\begin{array}{r}
-1 \\
2 \\
-2
\end{array}
\right]}
&
\frac{1}{\sqrt{5}}
\color{red}{\left[
\begin{array}{r}
-2 \\
0 \\
1
\end{array}
\right]}
&
\frac{1}{3\sqrt{5}}
\color{red}{\left[
\begin{array}{r}
2 \\
5 \\
4
\end{array}
\right]}
%
\end{array}
\right]
% sigma
\left[
\begin{array}{cc}
3 \sqrt{2} & 0 \\
0 & 0 \\
0 & 0 \\
\end{array}
\right]
% V
\frac{1}{\sqrt{2}}
\left[
\begin{array}{rc}
\color{blue}{-1} & \color{blue}{1} \\
\color{red}{1} & \color{red}{1} \\
\end{array}
\right].
\end{align}
$$
Blue vectors are in range spaces, red vectors in null spaces.
The thin SVD uses the range space components only:
$$
\mathbf{A} =
% U
\frac{1}{3}
\color{blue}{\left[
\begin{array}{r}
-1 \\
2 \\
-2
\end{array}
\right]}
% S
\left( 3\sqrt{2} \right)
% V
\frac{1}{\sqrt{2}}
\left[
\begin{array}{rc}
\color{blue}{-1} & \color{blue}{1}
\end{array}
\right].
$$
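Again as a sanity check, the thin SVD reconstructs the rank-one matrix from a single scaled outer product; a short pure-Python sketch:

```python
from math import sqrt

# Thin SVD pieces of the rank-one matrix: A = u1 * sigma1 * v1^T.
u1 = [-1/3, 2/3, -2/3]
sigma1 = 3 * sqrt(2)
v1t = [-1/sqrt(2), 1/sqrt(2)]

A = [[1, -1], [-2, 2], [2, -2]]

# Outer product u1 v1^T, scaled by the single nonzero singular value.
product = [[u1[i] * sigma1 * v1t[j] for j in range(2)] for i in range(3)]
assert all(abs(product[i][j] - A[i][j]) < 1e-12
           for i in range(3) for j in range(2))
print("thin SVD reconstructs the rank-one A")
```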
You may also benefit from the following example, from the question "SVD and the columns — I did this wrong but it seems that it still works, why?":
In this problem $U$ is a $4 \times 4$ orthogonal matrix, so its columns are an orthonormal basis for $\mathbb{R}^4$. Since $A$ has rank $2$, the first two columns of $U$ are an orthonormal basis for $\DeclareMathOperator{\Col}{Col} \Col(A)$ (which is a $2$-dimensional subspace of $\mathbb{R}^4$), so we need to extend this to an orthonormal basis for $\mathbb{R}^4$. Since $\mathbb{R}^4 = \Col(A) \oplus \Col(A)^\perp$, it suffices to find an orthonormal basis for $\Col(A)^\perp$.
Now $\DeclareMathOperator{\Nul}{Nul} \Col(A)^\perp = \Nul(A^T)$ by the definition of matrix multiplication in terms of dot products: $A^T x = 0$ iff $x$ is orthogonal to each of the rows of $A^T$ (which are the columns of $A$). We can compute a basis of $\Nul(A^T)$ (which will consist of two vectors) by row reduction, and then apply Gram-Schmidt to these vectors to get an orthonormal basis for $\Col(A)^\perp$.
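The last step can be sketched in pure Python. The two starting vectors `n1` and `n2` below are hypothetical placeholders (the actual basis of $\operatorname{Nul}(A^T)$ depends on the specific $A$, which is not reproduced here); only the Gram-Schmidt step itself is illustrated:

```python
from math import sqrt

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors
    (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = list(v)
        # Subtract the projection onto each vector already in the basis.
        for q in basis:
            c = dot(w, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        norm = sqrt(dot(w, w))
        basis.append([wi / norm for wi in w])
    return basis

# Hypothetical Nul(A^T) basis vectors, purely to illustrate the procedure.
n1 = [1, 1, 0, 0]
n2 = [1, 0, 1, 0]
q1, q2 = gram_schmidt([n1, n2])
assert abs(dot(q1, q2)) < 1e-12          # orthogonal
assert abs(dot(q1, q1) - 1) < 1e-12      # unit length
print("orthonormal basis:", q1, q2)
```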