Summary
Computing the full form of the singular value decomposition (SVD) will generate a set of orthonormal basis vectors for the null spaces $\color{red}{\mathcal{N} \left( \mathbf{A} \right)}$ and $\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}$.
Fundamental Theorem of Linear Algebra
A matrix $\mathbf{A} \in \mathbb{C}^{m\times n}_{\rho}$ (that is, with rank $\rho$) induces four fundamental subspaces: the range and the null space of both $\mathbf{A}$ and its Hermitian conjugate $\mathbf{A}^{*}$.
$$
\begin{align}
%
\mathbb{C}^{n} =
\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} \oplus
\color{red}{\mathcal{N} \left( \mathbf{A} \right)} \\
%
\mathbb{C}^{m} =
\color{blue}{\mathcal{R} \left( \mathbf{A} \right)} \oplus
\color{red} {\mathcal{N} \left( \mathbf{A}^{*} \right)}
%
\end{align}
$$
The singular value decomposition provides an orthonormal basis for the four fundamental subspaces.
Singular Value Decomposition
Every nonzero matrix can be expressed as the matrix product
$$
\begin{align}
\mathbf{A} &=
\mathbf{U} \, \Sigma \, \mathbf{V}^{*} \\
%
&=
% U
\left[ \begin{array}{cc}
\color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{cccc|ccc}
\sigma_{1} & 0 & \dots & & & \dots & 0 \\
0 & \sigma_{2} \\
\vdots & & \ddots \\
 & & & \sigma_{\rho} \\ \hline
 & & & & 0 \\
\vdots & & & & & \ddots \\
0 & & & & & & 0 \\
\end{array} \right]
% V
\left[ \begin{array}{c}
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} \\
\color{red}{\mathbf{V}_{\mathcal{N}}}^{*}
\end{array} \right] \\
%
& =
% U
\left[ \begin{array}{cccccccc}
\color{blue}{u_{1}} & \dots & \color{blue}{u_{\rho}} & \color{red}{u_{\rho+1}} & \dots & \color{red}{u_{m}}
\end{array} \right]
% Sigma
\left[ \begin{array}{cc}
\mathbf{S}_{\rho\times \rho} & \mathbf{0} \\
\mathbf{0} & \mathbf{0}
\end{array} \right]
% V
\left[ \begin{array}{c}
\color{blue}{v_{1}^{*}} \\
\vdots \\
\color{blue}{v_{\rho}^{*}} \\
\color{red}{v_{\rho+1}^{*}} \\
\vdots \\
\color{red}{v_{n}^{*}}
\end{array} \right]
%
\end{align}
$$
The column vectors of $\mathbf{U}$ are an orthonormal basis for $\mathbb{C}^{m}$, while the column vectors of $\mathbf{V}$ are an orthonormal basis for $\mathbb{C}^{n}$.
The $\rho$ singular values are real and ordered (descending):
$$
\sigma_{1} \ge \sigma_{2} \ge \dots \ge \sigma_{\rho}>0.
$$
These singular values form the diagonal matrix of singular values
$$
\mathbf{S} = \text{diagonal} (\sigma_{1},\sigma_{2},\dots,\sigma_{\rho}) \in\mathbb{R}^{\rho\times\rho}.
$$
The $\mathbf{S}$ matrix is embedded in the sabot matrix $\Sigma\in\mathbb{R}^{m\times n}$, whose shape ensures conformability.
Please note that the singular values only correspond to $\color{blue}{range}$ space vectors.
The column vectors form spans for the subspaces:
$$
\begin{align}
% R A
\color{blue}{\mathcal{R} \left( \mathbf{A} \right)} &=
\text{span} \left\{
\color{blue}{u_{1}}, \dots , \color{blue}{u_{\rho}}
\right\} \\
% R A*
\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} &=
\text{span} \left\{
\color{blue}{v_{1}}, \dots , \color{blue}{v_{\rho}}
\right\} \\
% N A*
\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)} &=
\text{span} \left\{
\color{red}{u_{\rho+1}}, \dots , \color{red}{u_{m}}
\right\} \\
% N A
\color{red}{\mathcal{N} \left( \mathbf{A} \right)} &=
\text{span} \left\{
\color{red}{v_{\rho+1}}, \dots , \color{red}{v_{n}}
\right\} \\
%
\end{align}
$$
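As a numerical aside (not part of the original argument), numpy's full SVD hands back exactly these four orthonormal bases by partitioning $\mathbf{U}$ and $\mathbf{V}$ at the rank. The matrix `A` below is a hypothetical rank-one example chosen for illustration:

```python
import numpy as np

# Hypothetical rank-1 example: each column is a multiple of (1, 2).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

U, s, Vh = np.linalg.svd(A)        # full SVD (full_matrices=True by default)
rho = int(np.sum(s > 1e-12))       # numerical rank

range_A      = U[:, :rho]          # orthonormal basis of R(A)
null_A_star  = U[:, rho:]          # orthonormal basis of N(A*)
range_A_star = Vh[:rho, :].T       # orthonormal basis of R(A*)
null_A       = Vh[rho:, :].T       # orthonormal basis of N(A)

# The null space bases are annihilated by A and A* respectively.
assert np.allclose(A @ null_A, 0)
assert np.allclose(A.conj().T @ null_A_star, 0)
```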
The conclusion is that the full SVD provides an orthonormal span for not only the two null spaces, but also both range spaces.
Example
Since there is some misunderstanding in the original question, let's show the rough outlines of constructing the SVD.
From your data, we have $2$ singular values. Therefore the rank $\rho = 2$. From this, we know the form of the SVD:
$$
\mathbf{A} =
% U
\left[ \begin{array}{cc}
\color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{c}
\mathbf{S} \\ \mathbf{0} \\
\end{array} \right]
% V
\left[ \begin{array}{c}
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*}
\end{array} \right]
$$
That is, the null space $\color{red}{\mathcal{N} \left( \mathbf{A} \right)}$ is trivial.
Construct the matrix $\Sigma$:
Form the product matrix, and compute the eigenvalue spectrum
$$
\lambda \left( \mathbf{A}^{*} \mathbf{A} \right) =
\lambda \left(
\left[
\begin{array}{cc}
7 & 6 \\
6 & 15 \\
\end{array}
\right]
\right) =
\left\{ 11 + 2 \sqrt{13},11-2 \sqrt{13} \right\}
$$
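This spectrum is easy to double-check numerically; a quick numpy sketch:

```python
import numpy as np

G = np.array([[7.0, 6.0],
              [6.0, 15.0]])                    # the product matrix A* A

evals = np.sort(np.linalg.eigvalsh(G))[::-1]   # descending order
expected = np.array([11 + 2*np.sqrt(13), 11 - 2*np.sqrt(13)])
assert np.allclose(evals, expected)
```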
The singular values are the square roots of the ordered eigenvalues:
$$
\sigma_{k} = \sqrt{\lambda_{k}},\qquad k = 1, \dots, \rho
$$
Construct the diagonal matrix of singular values $\mathbf{S}$ and embed this into the sabot matrix $\Sigma$:
$$
\mathbf{S} =
\left[
\begin{array}{cc}
\sqrt{11 + 2 \sqrt{13}} & 0 \\
0 & \sqrt{11-2 \sqrt{13}}
\end{array}
\right],
\qquad
%
\Sigma =
\left[
\begin{array}{c}
\mathbf{S} \\ \mathbf{0}
\end{array}
\right]
=
\left[
\begin{array}{cc}
\sqrt{11+2 \sqrt{13}} & 0 \\
0 & \sqrt{11-2 \sqrt{13}} \\\hline
0 & 0 \\
0 & 0 \\
\end{array}
\right]
%
$$
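A minimal numpy sketch of the embedding, using the $4\times 2$ shape of this example:

```python
import numpy as np

s13 = np.sqrt(13.0)
S = np.diag([np.sqrt(11 + 2*s13), np.sqrt(11 - 2*s13)])  # rho x rho

# Embed S in the sabot matrix, shaped so that U @ Sigma @ Vh is conformable.
Sigma = np.zeros((4, 2))
Sigma[:2, :2] = S
```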
Construct the matrix $\mathbf{V}$:
Solve for the eigenvectors of the product matrix $\mathbf{A}^{*} \mathbf{A}$. They are
$$
v_{1} =
\color{blue}{\left[
\begin{array}{c}
\frac{1}{3} \left(-2+\sqrt{13} \right) \\ 1
\end{array}
\right]}, \qquad
v_{2}=
\color{blue}{\left[
\begin{array}{c}
\frac{1}{3} \left(-2-\sqrt{13} \right) \\ 1
\end{array}
\right]}
$$
The normalized form of these vectors will form the columns of $\color{blue}{\mathbf{V}_{\mathcal{R}}}$
$$
\color{blue}{\mathbf{V}_{\mathcal{R}}} =
\left[
\begin{array}{cc}
% v1
\frac{3}{\sqrt{26-4 \sqrt{13}}}
\color{blue}{\left[ \begin{array}{c}
\frac{1}{3} \left(-2+\sqrt{13} \right) \\ 1
\end{array} \right]}
&
% v2
\frac{3}{\sqrt{26+4 \sqrt{13}}}
\color{blue}{\left[ \begin{array}{c}
\frac{1}{3} \left(-2-\sqrt{13} \right) \\ 1
\end{array} \right]}
\end{array}
%
\right]
$$
Because the null space $\color{red}{\mathcal{N} \left( \mathbf{A} \right)}$ is trivial,
$$
\mathbf{V} = \color{blue}{\mathbf{V}_{\mathcal{R}}}
$$
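As a numerical sanity check (an aside, not part of the construction), the normalized columns are orthonormal eigenvectors of $\mathbf{A}^{*} \mathbf{A}$:

```python
import numpy as np

s13 = np.sqrt(13.0)
G = np.array([[7.0, 6.0], [6.0, 15.0]])        # A* A

v1 = np.array([(-2 + s13) / 3, 1.0]); v1 /= np.linalg.norm(v1)
v2 = np.array([(-2 - s13) / 3, 1.0]); v2 /= np.linalg.norm(v2)
V = np.column_stack([v1, v2])

assert np.allclose(V.T @ V, np.eye(2))          # orthonormal columns
assert np.allclose(G @ v1, (11 + 2*s13) * v1)   # eigenpair for lambda_1
assert np.allclose(G @ v2, (11 - 2*s13) * v2)   # eigenpair for lambda_2
```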
Construct the matrix $\mathbf{U}$:
The thin SVD is
$$
\begin{align}
\mathbf{A} &=
% U
\color{blue}{\mathbf{U}_{\mathcal{R}}}
% Sigma
\mathbf{S} \,
% V
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*}
\end{align}
$$
which can be solved as
$$
\begin{align}
\color{blue}{\mathbf{U}_{\mathcal{R}}} &= \mathbf{A} \color{blue}{\mathbf{V}_{\mathcal{R}}} \mathbf{S}^{-1} \\
%%
&=
\left[ \begin{array}{cc}
\frac{1}{\sqrt{182 + 8\sqrt{13}}}
\color{blue}{\left[
\begin{array}{r}
7 + \sqrt{13} \\
4 + \sqrt{13} \\
-5 + \sqrt{13} \\
-1 + 2 \sqrt{13} \\
\end{array}
\right] }
&
%
\frac{1}{\sqrt{182 - 8\sqrt{13}}}
\color{blue}{\left[
\begin{array}{r}
7 - \sqrt{13} \\
4 - \sqrt{13} \\
-5 - \sqrt{13} \\
-1 - 2 \sqrt{13} \\
\end{array}
\right] }
\end{array} \right]
%%
\end{align}
$$
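A quick numerical check that these closed-form columns are indeed orthonormal:

```python
import numpy as np

s13 = np.sqrt(13.0)
u1 = np.array([7 + s13, 4 + s13, -5 + s13, -1 + 2*s13]) / np.sqrt(182 + 8*s13)
u2 = np.array([7 - s13, 4 - s13, -5 - s13, -1 - 2*s13]) / np.sqrt(182 - 8*s13)
UR = np.column_stack([u1, u2])

assert np.allclose(UR.T @ UR, np.eye(2))   # columns are orthonormal
```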
The thin SVD is now complete. If you insist upon the full form of the SVD, we can compute the two missing null space vectors in $\mathbf{U}$ using the Gram-Schmidt process. One such result is
$$
\mathbf{U} =
\left[ \begin{array}{cc}
\frac{1}{\sqrt{182 + 8\sqrt{13}}}
\color{blue}{\left[
\begin{array}{r}
7 + \sqrt{13} \\
4 + \sqrt{13} \\
-5 + \sqrt{13} \\
-1 + 2 \sqrt{13} \\
\end{array}
\right] }
&
%
\frac{1}{\sqrt{182 - 8\sqrt{13}}}
\color{blue}{\left[
\begin{array}{r}
7 - \sqrt{13} \\
4 - \sqrt{13} \\
-5 - \sqrt{13} \\
-1 - 2 \sqrt{13} \\
\end{array}
\right] }
&
\frac{1}{\sqrt{26}}
\color{red}{\left[
\begin{array}{r}
3 \\
-4 \\
1 \\
0 \\
\end{array}
\right] }
&
%
\frac{1}{\sqrt{1794}}
\color{red}{\left[
\begin{array}{r}
-9 \\
-14 \\
-29 \\
26 \\
\end{array}
\right] }
%
\end{array} \right]
$$
Conclusion
The singular values only interact with the first two range space vectors.
$$
\begin{align}
\mathbf{A} &=
% U
\left[ \begin{array}{cc}
\color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{c}
\mathbf{S} \\
\mathbf{0} \\
\end{array} \right]
% V
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} \\
&=
% U
\left[
\begin{array}{cccc}
\color{blue}{\star} & \color{blue}{\star} & \color{red}{\star} & \color{red}{\star} \\
\color{blue}{\star} & \color{blue}{\star} & \color{red}{\star} & \color{red}{\star} \\
\color{blue}{\star} & \color{blue}{\star} & \color{red}{\star} & \color{red}{\star} \\
\color{blue}{\star} & \color{blue}{\star} & \color{red}{\star} & \color{red}{\star} \\
\end{array}
\right]
% S
\left[
\begin{array}{cc}
\sqrt{11+2 \sqrt{13}} & 0 \\
0 & \sqrt{11-2 \sqrt{13}} \\\hline
0 & 0 \\
0 & 0 \\
\end{array}
\right]
% V
\left[
\begin{array}{cc}
\color{blue}{\star} & \color{blue}{\star} \\
\color{blue}{\star} & \color{blue}{\star} \\
\end{array}
\right]
\end{align}
$$
Best Answer
You're right that the columns of $U$ and $V$ are the eigenvectors of $AA^T$ and $A^TA$ (respectively). However, that information alone does not completely determine the SVD.
Analogously: in an eigendecomposition $A = PDP^{-1}$, the columns of $P$ are eigenvectors and the entries of $D$ are eigenvalues. Nevertheless, $$ \pmatrix{0&1\\1&0}\pmatrix{1\\&2}\pmatrix{0&1\\1&0}^{-1} \neq \pmatrix{1&0\\0&1}\pmatrix{1\\&2}\pmatrix{1&0\\0&1}^{-1}. $$ Once we've found a valid choice of $V$ (with eigenvectors ordered to match the eigenvalues of $A^TA$ in descending order), it remains to solve for a satisfactory $U$. If $A$ is invertible, then $$ A = U DV^T \implies U = AVD^{-1}. $$ Even if $A$ is not invertible, some of the columns of $U$ are determined by our choice of $V$.
To put it precisely: the $i$th column of $U$ is given by $u_i = \frac 1{\sigma_i}Av_i$ whenever $\sigma_i$ is nonzero. To see that these columns are orthonormal, note that $$ u_i^Tu_j = \frac{1}{\sigma_i\sigma_j} v_i^TA^TAv_j = \frac{\sigma_j^2}{\sigma_i\sigma_j} v_i^Tv_j, $$ which equals $1$ when $i = j$ and $0$ otherwise, since the $v_i$ are orthonormal eigenvectors of $A^TA$ with eigenvalues $\sigma_i^2$.
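A small numpy sketch of this recipe; the matrix `A` here is a hypothetical invertible example, not the one from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))            # hypothetical invertible example

# V and the squared singular values come from the eigendecomposition of A^T A.
evals, V = np.linalg.eigh(A.T @ A)         # eigh returns ascending order
order = np.argsort(evals)[::-1]            # reorder to descending
evals, V = evals[order], V[:, order]
D = np.diag(np.sqrt(evals))

# Solve for U via U = A V D^{-1}.
U = A @ V @ np.linalg.inv(D)

assert np.allclose(U.T @ U, np.eye(3))     # U is orthogonal
assert np.allclose(U @ D @ V.T, A)         # A = U D V^T is recovered
```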