Summary
Computing the full form of the singular value decomposition (SVD) will generate a set of orthonormal basis vectors for the null spaces $\color{red}{\mathcal{N} \left( \mathbf{A} \right)}$ and $\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}$.
Fundamental Theorem of Linear Algebra
A matrix $\mathbf{A} \in \mathbb{C}^{m\times n}_{\rho}$ induces four fundamental subspaces: the range and null space of $\mathbf{A}$ and of its adjoint $\mathbf{A}^{*}$.
$$
\begin{align}
%
\mathbb{C}^{n} =
\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} \oplus
\color{red}{\mathcal{N} \left( \mathbf{A} \right)} \\
%
\mathbb{C}^{m} =
\color{blue}{\mathcal{R} \left( \mathbf{A} \right)} \oplus
\color{red} {\mathcal{N} \left( \mathbf{A}^{*} \right)}
%
\end{align}
$$
The singular value decomposition provides an orthonormal basis for the four fundamental subspaces.
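This can be checked numerically. Below is a minimal NumPy sketch (the rank-$2$ test matrix is a hypothetical stand-in, not taken from the question) showing that the full SVD hands back orthonormal bases for all four subspaces:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical test matrix: m = 4, n = 3, rank 2 by construction.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))

U, s, Vt = np.linalg.svd(A)          # full form: U is 4x4, Vt is 3x3
rho = int(np.sum(s > 1e-10))         # numerical rank

U_R, U_N = U[:, :rho], U[:, rho:]    # bases for R(A) and N(A^*)
V_R, V_N = Vt[:rho].T, Vt[rho:].T    # bases for R(A^*) and N(A)

assert rho == 2
assert np.allclose(A @ V_N, 0)       # V_N really spans N(A)
assert np.allclose(A.T @ U_N, 0)     # U_N really spans N(A^*)
```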
Singular Value Decomposition
Every nonzero matrix can be expressed as the matrix product
$$
\begin{align}
\mathbf{A} &=
\mathbf{U} \, \Sigma \, \mathbf{V}^{*} \\
%
&=
% U
\left[ \begin{array}{cc}
\color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{ccc|c}
\sigma_{1} & & & \\
& \ddots & & \mathbf{0} \\
& & \sigma_{\rho} & \\\hline
& \mathbf{0} & & \mathbf{0}
\end{array} \right]
% V
\left[ \begin{array}{c}
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} \\
\color{red}{\mathbf{V}_{\mathcal{N}}}^{*}
\end{array} \right] \\
%
& =
% U
\left[ \begin{array}{cccccccc}
\color{blue}{u_{1}} & \dots & \color{blue}{u_{\rho}} & \color{red}{u_{\rho+1}} & \dots & \color{red}{u_{m}}
\end{array} \right]
% Sigma
\left[ \begin{array}{cc}
\mathbf{S}_{\rho\times \rho} & \mathbf{0} \\
\mathbf{0} & \mathbf{0}
\end{array} \right]
% V
\left[ \begin{array}{c}
\color{blue}{v_{1}^{*}} \\
\vdots \\
\color{blue}{v_{\rho}^{*}} \\
\color{red}{v_{\rho+1}^{*}} \\
\vdots \\
\color{red}{v_{n}^{*}}
\end{array} \right]
%
\end{align}
$$
The column vectors of $\mathbf{U}$ form an orthonormal basis for $\mathbb{C}^{m}$ (the column-space side), while the column vectors of $\mathbf{V}$ form an orthonormal basis for $\mathbb{C}^{n}$ (the row-space side).
The $\rho$ singular values are real and ordered (descending):
$$
\sigma_{1} \ge \sigma_{2} \ge \dots \ge \sigma_{\rho}>0.
$$
These singular values form the diagonal matrix of singular values
$$
\mathbf{S} = \text{diagonal} (\sigma_{1},\sigma_{2},\dots,\sigma_{\rho}) \in\mathbb{R}^{\rho\times\rho}.
$$
The $\mathbf{S}$ matrix is embedded in the sabot matrix $\Sigma\in\mathbb{R}^{m\times n}$, whose shape ensures conformability.
Please note that the singular values only correspond to $\color{blue}{range}$ space vectors.
The column vectors form spans for the subspaces:
$$
\begin{align}
% R A
\color{blue}{\mathcal{R} \left( \mathbf{A} \right)} &=
\text{span} \left\{
\color{blue}{u_{1}}, \dots , \color{blue}{u_{\rho}}
\right\} \\
% R A*
\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} &=
\text{span} \left\{
\color{blue}{v_{1}}, \dots , \color{blue}{v_{\rho}}
\right\} \\
% N A*
\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)} &=
\text{span} \left\{
\color{red}{u_{\rho+1}}, \dots , \color{red}{u_{m}}
\right\} \\
% N A
\color{red}{\mathcal{N} \left( \mathbf{A} \right)} &=
\text{span} \left\{
\color{red}{v_{\rho+1}}, \dots , \color{red}{v_{n}}
\right\} \\
%
\end{align}
$$
The conclusion is that the full SVD provides an orthonormal span for not only the two null spaces, but also both range spaces.
Example
Since there is some misunderstanding in the original question, let's show the rough outlines of constructing the SVD.
From your data, we have $2$ nonzero singular values. Therefore the rank is $\rho = 2$. From this, we know the form of the SVD:
$$
\mathbf{A} =
% U
\left[ \begin{array}{cc}
\color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{c}
\mathbf{S} \\ \mathbf{0} \\
\end{array} \right]
% V
\left[ \begin{array}{c}
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*}
\end{array} \right]
$$
That is, the null space $\color{red}{\mathcal{N} \left( \mathbf{A} \right)}$ is trivial.
Construct the matrix $\Sigma$:
Form the product matrix, and compute the eigenvalue spectrum
$$
\lambda \left( \mathbf{A}^{*} \mathbf{A} \right) =
\lambda \left(
\left[
\begin{array}{cc}
7 & 6 \\
6 & 15 \\
\end{array}
\right]
\right) =
\left\{ 11 + 2 \sqrt{13},11-2 \sqrt{13} \right\}
$$
The singular values are the square roots of the ordered eigenvalues:
$$
\sigma_{k} = \sqrt{\lambda_{k}},\qquad k = 1, \dots, \rho
$$
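As a quick numerical check of this step (a NumPy sketch, not part of the original derivation), the eigenvalues of the Gram matrix above are indeed $11 \pm 2\sqrt{13}$, and their square roots are the singular values:

```python
import numpy as np

AtA = np.array([[7.0, 6.0],
                [6.0, 15.0]])        # the product matrix A^* A from the text

lam = np.linalg.eigvalsh(AtA)[::-1]  # eigenvalues, sorted descending
expected = np.array([11 + 2*np.sqrt(13), 11 - 2*np.sqrt(13)])
assert np.allclose(lam, expected)

sigma = np.sqrt(lam)                 # singular values = square roots of eigenvalues
assert np.allclose(sigma**2, lam)
```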
Construct the diagonal matrix of singular values $\mathbf{S}$ and embed this into the sabot matrix $\Sigma$:
$$
\mathbf{S} =
\left[
\begin{array}{cc}
\sqrt{11 + 2 \sqrt{13}} & 0 \\
0 & \sqrt{11-2 \sqrt{13}}
\end{array}
\right],
\qquad
%
\Sigma =
\left[
\begin{array}{c}
\mathbf{S} \\ \mathbf{0}
\end{array}
\right]
=
\left[
\begin{array}{cc}
\sqrt{11+2 \sqrt{13}} & 0 \\
0 & \sqrt{11-2 \sqrt{13}} \\\hline
0 & 0 \\
0 & 0 \\
\end{array}
\right]
%
$$
Construct the matrix $\mathbf{V}$:
Solve for the eigenvectors of the product matrix $\mathbf{A}^{*} \mathbf{A}$. They are
$$
v_{1} =
\color{blue}{\left[
\begin{array}{c}
\frac{1}{3} \left(-2+\sqrt{13} \right) \\ 1
\end{array}
\right]}, \qquad
v_{2}=
\color{blue}{\left[
\begin{array}{c}
\frac{1}{3} \left(-2-\sqrt{13} \right) \\ 1
\end{array}
\right]}
$$
The normalized form of these vectors will form the columns of $\color{blue}{\mathbf{V}_{\mathcal{R}}}$
$$
\color{blue}{\mathbf{V}_{\mathcal{R}}} =
\left[
\begin{array}{cc}
% v1
\frac{3}{\sqrt{26-4 \sqrt{13}}}
\color{blue}{\left[ \begin{array}{c}
\frac{1}{3} \left(-2+\sqrt{13} \right) \\ 1
\end{array} \right]}
&
% v2
\frac{3}{\sqrt{26+4 \sqrt{13}}}
\color{blue}{\left[ \begin{array}{c}
\frac{1}{3} \left(-2-\sqrt{13} \right) \\ 1
\end{array} \right]}
\end{array}
%
\right]
$$
Because the null space $\color{red}{\mathcal{N} \left( \mathbf{A} \right)}$ is trivial,
$$
\mathbf{V} = \color{blue}{\mathbf{V}_{\mathcal{R}}}
$$
Construct the matrix $\mathbf{U}$:
The thin SVD is
$$
\begin{align}
\mathbf{A} &=
% U
\color{blue}{\mathbf{U}_{\mathcal{R}}}
% Sigma
\mathbf{S} \,
% V
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*}
\end{align}
$$
which can be solved as
$$
\begin{align}
\color{blue}{\mathbf{U}_{\mathcal{R}}} &= \mathbf{A} \color{blue}{\mathbf{V}_{\mathcal{R}}} \mathbf{S}^{-1} \\
%%
&=
\left[ \begin{array}{cc}
\frac{1}{\sqrt{182 + 8\sqrt{13}}}
\color{blue}{\left[
\begin{array}{r}
7 + \sqrt{13} \\
4 + \sqrt{13} \\
-5 + \sqrt{13} \\
-1 + 2 \sqrt{13} \\
\end{array}
\right] }
&
%
\frac{1}{\sqrt{182 - 8\sqrt{13}}}
\color{blue}{\left[
\begin{array}{r}
7 - \sqrt{13} \\
4 - \sqrt{13} \\
-5 - \sqrt{13} \\
-1 - 2 \sqrt{13} \\
\end{array}
\right] }
\end{array} \right]
%%
\end{align}
$$
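The identity $\color{blue}{\mathbf{U}_{\mathcal{R}}} = \mathbf{A} \color{blue}{\mathbf{V}_{\mathcal{R}}} \mathbf{S}^{-1}$ is easy to verify numerically. Here is a NumPy sketch on a hypothetical full-column-rank $4\times 2$ matrix (a stand-in, since the question's $\mathbf{A}$ is not written out):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 2))      # hypothetical stand-in, full column rank

U_R, s, Vh = np.linalg.svd(A, full_matrices=False)   # thin SVD
S_inv = np.diag(1.0 / s)

# Recover the range-space left singular vectors from A, V_R, and S:
U_from_identity = A @ Vh.conj().T @ S_inv
assert np.allclose(U_from_identity, U_R)
```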
The thin SVD is now complete. If you insist upon the full form of the SVD, we can compute the two missing null space vectors in $\mathbf{U}$ using the Gram-Schmidt process. One such result is
$$
\mathbf{U} =
\left[ \begin{array}{cc}
\frac{1}{\sqrt{182 + 8\sqrt{13}}}
\color{blue}{\left[
\begin{array}{r}
7 + \sqrt{13} \\
4 + \sqrt{13} \\
-5 + \sqrt{13} \\
-1 + 2 \sqrt{13} \\
\end{array}
\right] }
&
%
\frac{1}{\sqrt{182 - 8\sqrt{13}}}
\color{blue}{\left[
\begin{array}{r}
7 - \sqrt{13} \\
4 - \sqrt{13} \\
-5 - \sqrt{13} \\
-1 - 2 \sqrt{13} \\
\end{array}
\right] }
&
\frac{1}{\sqrt{26}}
\color{red}{\left[
\begin{array}{r}
3 \\
-4 \\
1 \\
0 \\
\end{array}
\right] }
&
%
\frac{1}{\sqrt{35}}
\color{red}{\left[
\begin{array}{r}
3 \\
-5 \\
0 \\
1 \\
\end{array}
\right] }
%
\end{array} \right]
$$
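One practical way to produce such null-space columns is to Gram–Schmidt (via QR) the identity columns against the thin $\mathbf{U}$. A NumPy sketch on a hypothetical $4\times 2$ matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 2))                      # hypothetical stand-in
U_thin, s, Vh = np.linalg.svd(A, full_matrices=False)

# QR of [U_thin | I] orthogonalizes the identity columns against U_thin,
# which is the Gram-Schmidt completion described above.
Q, _ = np.linalg.qr(np.hstack([U_thin, np.eye(4)]))
U_full = Q[:, :4]

assert np.allclose(U_full.T @ U_full, np.eye(4))     # orthonormal basis of R^4
assert np.allclose(A.T @ U_full[:, 2:], 0)           # last columns lie in N(A^*)
```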
Conclusion
The singular values only interact with the first two range space vectors.
$$
\begin{align}
\mathbf{A} &=
% U
\left[ \begin{array}{cc}
\color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{c}
\mathbf{S} \\
\mathbf{0} \\
\end{array} \right]
% V
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} \\
&=
% U
\left[
\begin{array}{cccc}
\color{blue}{\star} & \color{blue}{\star} & \color{red}{\star} & \color{red}{\star} \\
\color{blue}{\star} & \color{blue}{\star} & \color{red}{\star} & \color{red}{\star} \\
\color{blue}{\star} & \color{blue}{\star} & \color{red}{\star} & \color{red}{\star} \\
\color{blue}{\star} & \color{blue}{\star} & \color{red}{\star} & \color{red}{\star} \\
\end{array}
\right]
% S
\left[
\begin{array}{cc}
\sqrt{11+2 \sqrt{13}} & 0 \\
0 & \sqrt{11-2 \sqrt{13}} \\\hline
0 & 0 \\
0 & 0 \\
\end{array}
\right]
% V
\left[
\begin{array}{cc}
\color{blue}{\star} & \color{blue}{\star} \\
\color{blue}{\star} & \color{blue}{\star} \\
\end{array}
\right]
\end{align}
$$
A pair of singular vectors consists, by definition, of unit vectors $v_i$ (right) and $w_i$ (left) such that:
$$
Av_i=s_i w_i \qquad A^tw_i=s_i v_i
$$
then :
$$
AA^tw_i=s_i A v_i= s_i^2 w_i
$$
so $s_i^2$ are the eigenvalues of the Hermitian matrix $AA^t$ with corresponding eigenvectors $w_i$, so the $w_i$ can be chosen orthonormal. In fact it suffices to show that the $w_i$ are orthogonal; let $s_i\neq s_j$:
$$
s_i^2\langle w_i , w_j \rangle =\langle s_i^2 w_i , w_j \rangle = \langle AA^tw_i , w_j \rangle=\langle w_i , AA^tw_j \rangle=\langle w_i ,s_j^2 w_j \rangle=s_j^2\langle w_i , w_j \rangle
$$
so $\langle w_i , w_j\rangle =0$. This implies that
$$
\langle Av_i , Av_j\rangle=\langle s_i w_i , s_jw_j\rangle=s_i s_j\langle w_i , w_j\rangle=0
$$
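The relation $s_i^2 = \lambda_i(AA^t)$ derived above can be confirmed numerically (a NumPy sketch on a hypothetical random matrix):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 5))                      # hypothetical stand-in

s = np.linalg.svd(A, compute_uv=False)               # descending singular values
lam = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]     # eigenvalues of A A^t

assert np.allclose(lam, s**2)                        # s_i^2 are the eigenvalues
```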
For the second question, about rank, we will use two simple facts:
$$
rank(AB)\leq \min(rank(A),rank(B))\\
rank(A+B)\leq rank(A)+rank(B)
$$
so we know that the matrix $v_iv_i^t$ has rank one (because $(v_iv_i^t)x=\langle v_i , x \rangle v_i$ for all $x$), and then for each $i$ the matrix $Av_i v_i^t$ has rank one (provided $Av_i = s_i w_i \neq 0$, i.e. $s_i \neq 0$), so
$$
rank(A_k)=rank(\sum_{i=1}^k A v_i v_i^t ) \leq \sum_{i=1}^k rank(A v_i v_i^t)=k
$$
To verify equality, we can use this question (the answer there remains true for $n\times d$ matrices) and see that
$$
Ran(Av_iv_i^t)=\mathbb{R}(Av_i)\\
Ran((Av_iv_i^t)^t)=Ran(v_iv_i^t A^t )=\mathbb{R} v_i
$$
(because $Ran(v_i v_i^t)=\mathbb{R} v_i $)
so for $i\neq j$ :
$$
Ran(Av_iv_i^t)\cap Ran(Av_jv_j^t)= \mathbb{R}(Av_i) \cap \mathbb{R}(Av_j)=\{0\} \;\textrm{Orthogonality from the first question }\\
Ran((Av_iv_i^t)^t)\cap Ran((Av_jv_j^t)^t)=\mathbb{R} v_i \cap \mathbb{R} v_j =\{0\} \; \textrm{Orthogonality with the same argument }
$$
so $rank(A_k)=k$.
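The rank claim can likewise be spot-checked: building $A_k=\sum_{i=1}^k s_i w_i v_i^t$ from the SVD of a hypothetical random matrix yields a matrix of rank exactly $k$ (NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 4))                      # hypothetical stand-in
U, s, Vt = np.linalg.svd(A)

k = 2
# Truncated SVD: sum of the first k rank-one terms s_i * w_i v_i^t
A_k = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(k))

assert np.linalg.matrix_rank(A_k) == k
```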
Best Answer
Here's a procedure you can use to find the SVD of $A$. It's quite "quick and dirty" in the sense that I've just shown the procedure and not explained anything, but hopefully, it'll help as a guide for when you learn the general procedure from someone much more qualified than I :)
Let $A=\begin{bmatrix} 1 & 1 \\ 0 & 0\end{bmatrix}$ with SVD $A=U\Sigma V^T$.
First, find the eigenvalues of $A^TA$. So we have $$\begin{bmatrix} 1 & 0 \\ 1 & 0\end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 0\end{bmatrix}=\begin{bmatrix} 1 & 1 \\ 1 & 1\end{bmatrix},$$ which has eigenvalues $\lambda_1=2$ and $\lambda_2=0$, each with algebraic multiplicity $1$. The singular values of $A$ are therefore $\sigma_1=\sqrt2$ and $\sigma_2=0$.
Next, find the eigenspaces corresponding to each eigenvalue. For $\lambda_1=2$, we have $$ E_{\lambda_1}=\ker\left(\begin{bmatrix} -1 & 1 \\ 1 & -1\end{bmatrix}\right)=\mathrm{span}\left\{\begin{bmatrix} 1 \\ 1\end{bmatrix}\right\}. $$ And for $\lambda_2=0$ we have $$ E_{\lambda_2}=\ker\left(\begin{bmatrix} 1 & 1 \\ 1 & 1\end{bmatrix}\right)=\mathrm{span}\left\{\begin{bmatrix} -1 \\ 1\end{bmatrix}\right\}. $$ Now take the union of the spanning sets for the eigenspaces and make the set orthonormal. The Gram–Schmidt process is the way to go. We obtain $$\left\{\begin{bmatrix} \frac{\sqrt2}{2} \\ \frac{\sqrt2}{2}\end{bmatrix},\begin{bmatrix} -\frac{\sqrt2}{2} \\ \frac{\sqrt2}{2}\end{bmatrix}\right\}.$$ Name these vectors $v_1$ and $v_2$, respectively. Then, define $$ u_1=\frac{1}{\sigma_1}Av_1=\begin{bmatrix} 1 \\ 0\end{bmatrix}. $$ We would like to define $u_2$ in the same way, $u_2=\frac{1}{\sigma_2}Av_2$, but $\sigma_2=0$, so instead we set $u_2$ to be a unit vector orthogonal to $u_1$. Clearly, $$ u_2=\begin{bmatrix} 0 \\ 1\end{bmatrix}. $$
Finally, we obtain the following matrices: $$ \begin{align} V&=\begin{bmatrix} v_1 & v_2\end{bmatrix}=\begin{bmatrix} \frac{\sqrt2}{2} & -\frac{\sqrt2}{2}\\ \frac{\sqrt2}{2} & \frac{\sqrt2}{2}\end{bmatrix}\\ U&=\begin{bmatrix} u_1 & u_2\end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 0 & 1\end{bmatrix}\\ \Sigma&=\begin{bmatrix} \sigma_1 & 0 \\ 0 & \sigma_2\end{bmatrix}=\begin{bmatrix} \sqrt2 & 0 \\ 0 & 0\end{bmatrix}. \end{align} $$
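The factors above can be verified with a few lines of NumPy (a sanity check, not part of the derivation):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 0.0]])

U = np.eye(2)
Sigma = np.diag([np.sqrt(2.0), 0.0])
V = np.array([[np.sqrt(2)/2, -np.sqrt(2)/2],
              [np.sqrt(2)/2,  np.sqrt(2)/2]])

assert np.allclose(U @ Sigma @ V.T, A)               # A = U Sigma V^T
```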
Hope this helps!