Ok, suppose that $A = U \Sigma V^{T}$. First, consider how the SVD is constructed; the construction uses the following idea.
We get the left singular vectors $U$ from the symmetric matrix $AA^{T}$:
$$ AA^{T} = U \Sigma V^{T}( U \Sigma V^{T})^{T} = U \Sigma V^{T} (V \Sigma^{T} U^{T}) $$
now note that $V^{T}V = I$ so we have
$$ U \Sigma V^{T} (V \Sigma^{T} U^{T}) = U \Sigma \Sigma^{T} U^{T} $$
note that $\Sigma$ is a diagonal matrix (take the square case for simplicity), so $\Sigma^{T} = \Sigma$ and
$$ U \Sigma \Sigma^{T} U^{T} = U \Sigma^{2} U^{T}$$
now the singular values satisfy $\sigma_{i}^{2} = \lambda_{i}$, where $\lambda_{i}$ are the eigenvalues of $AA^{T}$, so you can write
$$ U \Lambda^{\frac{1}{2}} \Lambda^{\frac{1}{2}} U^{T} = U \Lambda U^{T} $$
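As a quick numerical sanity check of this relation (a NumPy sketch; the matrix `A` below is an arbitrary random example, not anything from the argument itself):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # arbitrary example matrix

# Singular values of A, returned in descending order
sigma = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of the symmetric matrix A A^T (eigvalsh returns ascending order)
lam = np.linalg.eigvalsh(A @ A.T)

# The nonzero eigenvalues of A A^T are exactly sigma_i^2; here A A^T is
# 4x4 of rank 3, so its remaining eigenvalue is (numerically) zero.
assert np.allclose(np.sort(lam)[::-1][:3], sigma**2)
```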
Similarly, the right singular vectors $V$ are given as
$$ A^{T}A = (U \Sigma V^{T})^T (U \Sigma V^{T}) = V \Lambda V^{T} $$
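The same check for the right singular vectors (again an illustrative NumPy sketch with a random matrix): each column $v_i$ of $V$ should be an eigenvector of $A^{T}A$ with eigenvalue $\sigma_{i}^{2}$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
U, s, Vt = np.linalg.svd(A)  # Vt holds the right singular vectors as rows

# Each right singular vector v_i is an eigenvector of A^T A
# with eigenvalue sigma_i^2.
for i in range(3):
    v = Vt[i]
    assert np.allclose(A.T @ A @ v, s[i] ** 2 * v)
```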
Note that you can write this like
$$ V^{T} A^{T}A V = \begin{pmatrix} v_{1}^{T} \\ v_{2}^{T} \\ \vdots \\ v_{m}^{T} \end{pmatrix} \left( A^{T}A v_{1}, A^{T}A v_{2}, \cdots, A^{T}A v_{m} \right) = \begin{pmatrix} v_{1}^{T} \\ v_{2}^{T} \\ \vdots \\ v_{m}^{T} \end{pmatrix} \left( \lambda_{1} v_{1}, A^{T}A v_{2}, \cdots, A^{T}A v_{m} \right) = \begin{pmatrix} \lambda_{1} & z^{*} \\ 0 & B \end{pmatrix} $$
The first column collapses to $(\lambda_{1}, 0, \dots, 0)^{T}$ because $v_{j}^{T} v_{1} = 0$ for $j > 1$.
where by induction we have $B = \hat{V} S \hat{V}^{T}$ (in fact $z^{*} = 0$ here, since $A^{T}A$ is symmetric and hence so is the left-hand side, but we carry it along), so
$$ V^{T} A^{T}A V = \begin{pmatrix} \lambda_{1} & z^{*} \\ 0 & \hat{V} S \hat{V}^{T}\end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & \hat{V} \end{pmatrix} \begin{pmatrix} \lambda_{1} & z^{*} \\ 0 & S \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & \hat{V} \end{pmatrix}^{T} $$
We use the same idea for $\tilde{A} = \tilde{U}^{T} A \tilde{V}$, where $\tilde{v}_{1}$ is a unit vector with $\|A \tilde{v}_{1}\|_{2} = \sigma_{1}$, $\tilde{u}_{1} = A \tilde{v}_{1} / \sigma_{1}$, and $\tilde{U}, \tilde{V}$ are orthogonal matrices whose first columns are $\tilde{u}_{1}, \tilde{v}_{1}$:
$$ \tilde{U}^{T} A \tilde{V} = \begin{pmatrix} \tilde{u}_{1}^{T} \\ \tilde{u}_{2}^{T} \\ \vdots \\ \tilde{u}_{m}^{T} \end{pmatrix} \left(A \tilde{v}_{1} , A \tilde{v}_{2} , \cdots, A \tilde{v}_{m} \right) \\ = \begin{pmatrix} \tilde{u}_{1}^{T} \\ \tilde{u}_{2}^{T} \\ \vdots \\ \tilde{u}_{m}^{T} \end{pmatrix} \left(\sigma_{1} \tilde{u}_{1} , A \tilde{v}_{2} , \cdots, A \tilde{v}_{m} \right) = \begin{pmatrix} \sigma_{1} & z^{*} \\ 0 & \hat{A} \end{pmatrix} $$
with $\hat{A} = \hat{U} \hat{\Sigma} \hat{V}^{T}$ by the inductive hypothesis, and once we show $z^{*} = 0$, this factors as
$$ \begin{pmatrix} \sigma_{1} & 0 \\ 0 & \hat{A} \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & \hat{U} \end{pmatrix} \begin{pmatrix} \sigma_{1} & 0 \\ 0 & \hat{\Sigma} \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & \hat{V} \end{pmatrix}^{T} $$
The point is to continue down the diagonal in this way; the key step is to prove that $z^{*} = 0$. Observe
$$ \bigg\| \begin{bmatrix} \sigma_{1} & z^{*} \\ 0 & \hat{A} \end{bmatrix} \begin{bmatrix} \sigma_{1} \\ z \end{bmatrix} \bigg\|_{2} \geq \sigma_{1}^{2} + z^{*} z = \left( \sigma_{1}^{2} + z^{*}z\right)^{\frac{1}{2}} \bigg\| \begin{bmatrix} \sigma_{1} \\ z \end{bmatrix} \bigg\|_{2} $$
Now, the $2$-norm is invariant under orthogonal transformations, so $\| \tilde{A}\|_{2} = \| \tilde{U}^{T} A \tilde{V} \|_{2} = \|A \|_{2} = \sigma_{1}$. But the inequality above gives $\| \tilde{A}\|_{2} \geq \left( \sigma_{1}^{2} + z^{*}z \right)^{\frac{1}{2}}$, which forces $z^{*} = 0$.
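The two facts used here can be checked numerically (an illustrative NumPy sketch; `Q1`, `Q2` are arbitrary orthogonal matrices built by QR, not objects from the proof): the $2$-norm equals $\sigma_1$, and it is unchanged by orthogonal transformations on either side.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

# Random orthogonal matrices via QR factorizations
Q1, _ = np.linalg.qr(rng.standard_normal((4, 4)))
Q2, _ = np.linalg.qr(rng.standard_normal((4, 4)))

sigma1 = np.linalg.svd(A, compute_uv=False)[0]

# ||A||_2 equals the largest singular value ...
assert np.isclose(np.linalg.norm(A, 2), sigma1)
# ... and is invariant under orthogonal transformations on either side.
assert np.isclose(np.linalg.norm(Q1.T @ A @ Q2, 2), sigma1)
```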
Once you have done this we then have
$$ A = \tilde{U} \tilde{A} \tilde{V}^{T} = \tilde{U} \begin{pmatrix} \sigma_{1} & 0 \\ 0 & \hat{A} \end{pmatrix} \tilde{V}^{T} = \tilde{U} \begin{pmatrix} 1 & 0 \\ 0 & \hat{U} \end{pmatrix} \begin{pmatrix} \sigma_{1} & 0 \\ 0 & \hat{\Sigma} \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & \hat{V} \end{pmatrix}^{T} \tilde{V}^{T} $$
This handles a general matrix $A \in \mathbb{R}^{n \times m}$: since $\hat{A} \in \mathbb{R}^{(n-1) \times (m-1)}$, the inductive hypothesis gives it an SVD, and substituting back completes the construction.
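The inductive construction can be run directly as code. Below is a NumPy sketch that mirrors the proof step by step (the helper `extend_to_orthogonal` and the function name are my own, and `eigh` stands in for "pick a maximizing unit vector $\tilde{v}_1$"): peel off $\sigma_1$, recurse on $\hat{A}$, and reassemble the block factorization.

```python
import numpy as np

def extend_to_orthogonal(x):
    """Return an orthogonal matrix whose first column is the unit vector x."""
    n = x.size
    # QR of [x | I] yields an orthonormal basis whose first vector spans x.
    Q, _ = np.linalg.qr(np.column_stack([x, np.eye(n)]))
    if Q[:, 0] @ x < 0:  # qr may flip the sign of the first column
        Q[:, 0] *= -1
    return Q

def svd_by_induction(A):
    """SVD built as in the inductive proof: peel off sigma_1, then recurse."""
    n, m = A.shape
    if n == 0 or m == 0 or np.allclose(A, 0):
        return np.eye(n), np.zeros((n, m)), np.eye(m)
    # v1 = top eigenvector of A^T A, sigma_1 = ||A v1||, u1 = A v1 / sigma_1
    _, V_eig = np.linalg.eigh(A.T @ A)   # eigenvectors, ascending eigenvalues
    v1 = V_eig[:, -1]
    sigma1 = np.linalg.norm(A @ v1)
    u1 = A @ v1 / sigma1
    U_t = extend_to_orthogonal(u1)       # plays the role of U-tilde
    V_t = extend_to_orthogonal(v1)       # plays the role of V-tilde
    B = U_t.T @ A @ V_t                  # = [[sigma_1, z*], [0, A-hat]], z* ~ 0
    U_h, S_h, V_h = svd_by_induction(B[1:, 1:])
    # Assemble the block factorization from the proof.
    U_in, V_in = np.eye(n), np.eye(m)
    U_in[1:, 1:], V_in[1:, 1:] = U_h, V_h
    S = np.zeros((n, m))
    S[0, 0] = sigma1
    S[1:, 1:] = S_h
    return U_t @ U_in, S, V_t @ V_in

A = np.random.default_rng(4).standard_normal((4, 3))
U, S, V = svd_by_induction(A)
assert np.allclose(U @ S @ V.T, A)
assert np.allclose(np.diag(S)[:3], np.linalg.svd(A, compute_uv=False))
```

Note that `B[1:, 0]` and `B[0, 1:]` come out (numerically) zero for exactly the reasons in the proof: the remaining columns of $\tilde{U}$ are orthogonal to $\tilde{u}_1$, and $z^* = 0$.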
I'm pretty sure I've messed up the $\hat{A}, \tilde{A}$ somewhere along here..
Best Answer
If $y\in \text{Col}(UV^T)$ then $y=UV^Tx$ for some $x \in \mathbb{R}^n$ and so $$y=UV^Tx=U\big(V^Tx\big)\in\text{Col}(U)$$ On the other hand, if $y\in \text{Col}(U),$ then $y=Ux$ for some $x\in \mathbb{R}^r$ and so $$y=Ux=UI_{r}x=UV^T(Vx)\in \text{Col}(UV^T)$$ This shows $\text{Col}(U)=\text{Col}(UV^T)$, and since $\text{rank}(U)=r$ (the columns of $U$ are orthonormal) we must have $\text{rank}(UV^T)=r$ as well.
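A quick NumPy illustration of this conclusion (the matrices `U`, `V` below are arbitrary random examples with $r$ orthonormal columns, not objects from the answer):

```python
import numpy as np

rng = np.random.default_rng(3)
n, r = 6, 3

# U, V: n x r matrices with orthonormal columns (thin QR of random matrices)
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))

# Col(U V^T) = Col(U), so rank(U V^T) = rank(U) = r
assert np.linalg.matrix_rank(U @ V.T) == r
assert np.linalg.matrix_rank(U) == r
```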