First, you slightly mixed up the order of the eigenvectors. For $MM^T$ the eigenvectors are $u_1 =(0,1,0)$, $u_2 = c(1,0,0)$ and $u_3 = c(0,0,1)$, corresponding to the eigenvalues $9$, $4$ and $0$. For $M^T M$ the result is immediate, with $v_1 = c(0,1)$ and $v_2 = (-1,0)$ corresponding to $9$ and $4$.
Second, some more comments: it seems like you overdid the computations. Note that $MM^T$ is a diagonal matrix, hence its eigenvalues are the diagonal entries, with orthogonal (normalized) eigenvectors. The non-zero eigenvalues of $M^TM$ are the same, while the eigenvectors (due to symmetry) are also the same up to a sign. Namely, once you see $MM^T$, everything else follows immediately, without Gram–Schmidt, QR and so on.
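For instance, here is a quick NumPy check of these facts. The matrix $M$ below is a hypothetical choice (the question's $M$ isn't reproduced here), picked so that $MM^T = \operatorname{diag}(4, 9, 0)$ matches the eigenvalues above:

```python
import numpy as np

# Hypothetical M chosen so that M M^T = diag(4, 9, 0), matching the setup above
M = np.array([[2., 0.],
              [0., 3.],
              [0., 0.]])

MMT = M @ M.T            # diagonal => eigenvalues are the diagonal entries
MTM = M.T @ M

assert np.allclose(MMT, np.diag([4., 9., 0.]))

# The nonzero eigenvalues of M M^T and M^T M coincide:
ev_big = np.linalg.eigvalsh(MMT)     # ascending: [0, 4, 9]
ev_small = np.linalg.eigvalsh(MTM)   # ascending: [4, 9]
assert np.allclose(ev_big[-2:], ev_small)

# The singular values of M are their square roots:
assert np.allclose(np.linalg.svd(M, compute_uv=False), [3., 2.])
```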
Regarding the choice of the signs: this is slightly more subtle, as the singular vectors are uniquely determined only up to a sign. As such, you have to be careful with the sign choice in order to "reconstruct" the original $M$. Please see the last answer (about the choice of a sign) at Calculating SVD by hand: resolving sign ambiguities in the range vectors.
UL factorization from LU factorization with permutation matrix
According to the preprint Necessary And Sufficient Conditions For Existence of the LU Factorization of an Arbitrary Matrix, a necessary and sufficient condition for the existence of an LU factorization of a square $n\times n$ matrix $A$ is given by:
$$(1)\,\text{$A$ has LU factorization} \iff \forall k\colon\, \operatorname{rk}(A[\colon k,\, \colon k]) \ge \operatorname{rk}(A[\colon k,\, \colon n]) + \operatorname{rk}(A[\colon n,\, \colon k]) -k$$
where $A[\colon i, \colon j]$ denotes the matrix consisting of the first $i$ rows and $j$ columns of $A$. We shall see that:
$$(2)\,\text{$A$ has UL factorization} \iff \forall k\colon\, \operatorname{rk}(A[k\colon,\, k\colon]) \ge \operatorname{rk}(A[k\colon,\, n\colon]) + \operatorname{rk}(A[n\colon,\, k\colon]) -k$$
where $A[i\colon, j\colon]$ denotes the matrix consisting of the last $i$ rows and $j$ columns of $A$.
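As a quick numerical illustration, conditions (1) and (2) can be checked directly with NumPy (a sketch; `has_lu` and `has_ul` are hypothetical helper names, and the Python slice `A[n-k:, n-k:]` encodes the "last $k$ rows and columns" notation used above):

```python
import numpy as np

def rk(X):
    return np.linalg.matrix_rank(X)

def has_lu(A):
    # Condition (1): rank test on the leading k x k, k x n, n x k submatrices
    n = A.shape[0]
    return all(rk(A[:k, :k]) >= rk(A[:k, :]) + rk(A[:, :k]) - k
               for k in range(1, n + 1))

def has_ul(A):
    # Condition (2): same test on the trailing (last k rows/columns) submatrices
    n = A.shape[0]
    return all(rk(A[n-k:, n-k:]) >= rk(A[n-k:, :]) + rk(A[:, n-k:]) - k
               for k in range(1, n + 1))

A = np.array([[1., 1.],
              [1., 0.]])
assert has_lu(A)       # e.g. A = [[1,0],[1,1]] @ [[1,1],[0,-1]]
assert not has_ul(A)   # fails at k = 1: the trailing entry A[n,n] is 0
```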
Now, without further ado: given any $k\times l$ matrix $X$, let $\overline X = P_k^T X P_l$ be the flip of $X$, where $(P_n)_{ij}=\delta_{i+j,\,n+1}$ is the exchange matrix of size $n$. We will show:
$$(3)\qquad \text{$A$ has UL factorization} \iff \text{$\overline A$ has LU factorization}$$
So the existence of an LU factorization of $B=\overline A$ is in fact equivalent to $A$ having a UL factorization.
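A small NumPy sketch of this equivalence (the no-pivot Doolittle elimination below is an illustrative helper, and assumes the flipped matrix happens to have nonzero pivots):

```python
import numpy as np

def lu_nopivot(B):
    # Doolittle elimination without pivoting (assumes nonzero pivots)
    n = B.shape[0]
    L, U = np.eye(n), B.astype(float).copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, :] -= L[i, k] * U[k, :]
    return L, U

def flip(X):
    # X-bar = P^T X P: reverse the order of both rows and columns
    return X[::-1, ::-1]

A = np.array([[4., 3., 2.],
              [6., 3., 1.],
              [8., 5., 9.]])
B = flip(A)                    # B = A-bar
L, U = lu_nopivot(B)           # A-bar = L U
assert np.allclose(B, L @ U)

Uul, Lul = flip(L), flip(U)    # flip back: A = flip(L) @ flip(U)
assert np.allclose(A, Uul @ Lul)       # a UL factorization of A
assert np.allclose(Uul, np.triu(Uul))  # flip of lower-triangular is upper-triangular
assert np.allclose(Lul, np.tril(Lul))  # flip of upper-triangular is lower-triangular
```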
Proof of (2): Note that the preprint proves (1) by splitting $A$ into a block matrix at index $k$, for every $k=1,\ldots,n$:
$$
A =\left[\begin{array}{cc}
A_{11} & A_{12} \\
A_{21} & A_{22}
\end{array}\right]
=\left[\begin{array}{ll}
L_{11}U_{11} & L_{11}U_{12} \\
L_{21}U_{11} & L_{21}U_{12} + L_{22} U_{22}
\end{array}\right]
=\left[\begin{array}{cc}
L_{11} & 0 \\
L_{21} & L_{22}
\end{array}\right]\left[\begin{array}{cc}
U_{11} & U_{12} \\
0 & U_{22}
\end{array}\right]
=LU
$$
and then apply rank inequalities of the type $\operatorname{rk}(XY) \ge \operatorname{rk}(X) + \operatorname{rk}(Y) -k$ to the blocks.
Note that $A=LU$ is equivalent to $P^TAP = (P^T L P)(P^T U P)$ (since $PP^T=I$), i.e. $\overline A = \overline L \cdot \overline U$, hence
$$
\overline A =\left[\begin{array}{cc}
\overline A_{22} & \overline A_{21} \\
\overline A_{12} & \overline A_{11}
\end{array}\right]
=\left[\begin{array}{ll}
\overline L_{21}\overline U_{12} + \overline L_{22} \overline U_{22} & \overline L_{21}\overline U_{11} \\
\overline L_{11}\overline U_{12} & \overline L_{11}\overline U_{11}
\end{array}\right]
=\left[\begin{array}{cc}
\overline L_{22} & \overline L_{21} \\
0 & \overline L_{11}
\end{array}\right]\left[\begin{array}{cc}
\overline U_{22} & 0 \\
\overline U_{12} & \overline U_{11}
\end{array}\right]
=\overline L \overline U
$$
which is a block UL factorization! So we can in principle apply the exact same argument as in the proof of (1) given in the preprint, "modulo flipping", to conclude (2).
Proof of (3): This now follows immediately because conditions (1) and (2) are equivalent modulo flipping. Indeed, for any matrix $X$ we have $\operatorname{rk}(X) = \operatorname{rk}(\overline X)$, and the submatrices in the block decomposition of $\overline A$ are precisely the flips of the submatrices in the block decomposition of $A$, since
$$\begin{aligned}
\forall i,j\colon\; (P_n^T AP_n)[\colon i,\, \colon j] = P_{i}^T\, A[i\colon,\, j\colon]\, P_{j}
\end{aligned}$$
That is, $\overline{A}[\colon i,\, \colon j] = \overline{A[i\colon,\, j\colon]}$: the first $i$ rows and $j$ columns of the flip are the flip of the last $i$ rows and $j$ columns. This allows us to conclude
$$\begin{aligned}
&\text{$A$ has LU factorization}
\\ &\iff \forall k\colon\, \operatorname{rk}(A[\colon k,\, \colon k]) \ge \operatorname{rk}(A[\colon k,\, \colon n]) + \operatorname{rk}(A[\colon n,\, \colon k]) -k
\\&\iff \forall k\colon\, \operatorname{rk}(\overline{A[\colon k,\, \colon k]}) \ge \operatorname{rk}(\overline{A[\colon k,\, \colon n]}) + \operatorname{rk}(\overline{A[\colon n,\, \colon k]}) -k
\\&\iff \forall k\colon\, \operatorname{rk}(\overline{A}[k\colon,\, k\colon]) \ge \operatorname{rk}(\overline{A}[k\colon,\, n\colon]) + \operatorname{rk}(\overline{A}[n\colon,\, k\colon]) -k
\\ &\iff \text{$\overline A$ has UL factorization}
\end{aligned}$$
Best Answer
You can flip the sign of any column of $Q$ (and make corresponding changes to the same row of $R$) and keep $A=QR$ and $Q$ orthogonal. In the example you've given, you can easily see that the two solutions are equivalent except for this sign change on some columns of $Q$ and corresponding rows of $R$.
In that sense, the QR factorization is not unique. Most codes for computing the QR factorization don't worry about this and let the diagonal elements of $R$ have arbitrary signs. Some codes will arrange the computation so that $R$ has non-negative entries on the diagonal.
If you want such a QR factorization after using a routine that doesn't care about the sign of the diagonal entries of $R$, you can easily correct it: for each row $i$ in which $R_{i,i}<0$, flip the sign of row $i$ of $R$ and flip the sign of column $i$ of $Q$.
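A sketch of this fix with NumPy (using `numpy.linalg.qr`, whose reduced factorization makes no promise about the signs on the diagonal of $R$):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])
Q, R = np.linalg.qr(A)   # reduced QR; signs of diag(R) are arbitrary

# Normalize so that R has a non-negative diagonal:
# flip the sign of row i of R and column i of Q wherever R[i,i] < 0.
s = np.sign(np.diag(R))
s[s == 0] = 1            # leave zero pivots alone
Q, R = Q * s, s[:, None] * R

assert np.all(np.diag(R) >= 0)
assert np.allclose(Q @ R, A)                         # still a factorization of A
assert np.allclose(Q.T @ Q, np.eye(Q.shape[1]))      # Q still has orthonormal columns
```

This works because multiplying $Q$ on the right by a diagonal $\pm 1$ matrix $S$ and $R$ on the left by the same $S$ leaves the product $QSSR = QR$ unchanged.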
If $A$ has full column rank, and if we follow the convention that the diagonal entries of $R$ are constrained to be positive, then the QR factorization is unique; it's a good exercise to prove this.