The question is not posed completely clearly, but I think that something close to what the questioner wants should follow quickly from the singular value decomposition, which states that any real matrix $A$ can be written in the form
$$
A=UDV,
$$
where $U$ and $V$ are square real orthogonal matrices and $D$ is a (possibly rectangular) diagonal matrix with nonnegative entries on the diagonal. Since $U$ and $V$ are orthogonal they are products of rotations and reflections, while $D$ can be thought of as a product of projections and scalings.
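As a quick numerical illustration of this factorization, here is a short NumPy sketch (the matrix $A$ below is just a sample; `np.linalg.svd` returns $U$, the singular values, and $V^T$):

```python
import numpy as np

# Any real matrix, square or rectangular, admits an SVD.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

# NumPy returns U, the singular values s, and V^T (Vt plays the
# role of the right orthogonal factor in A = U D V).
U, s, Vt = np.linalg.svd(A)

# Rebuild the (possibly rectangular) diagonal factor D.
D = np.zeros_like(A)
D[:len(s), :len(s)] = np.diag(s)

# U and Vt are orthogonal, D has nonnegative diagonal entries,
# and the product recovers A.
assert np.allclose(U @ U.T, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(3))
assert np.all(s >= 0)
assert np.allclose(U @ D @ Vt, A)
```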
For example, if
$$
A=\left(\begin{array}{cc}1&2x\\0&1\end{array}\right),
$$
then
$$
A=
\left(\begin{array}{cc}\cos \phi&-\sin\phi\\\sin\phi&\cos\phi\end{array}\right)
\left(\begin{array}{cc}\sqrt{x^2+1}-x&0\\0&\sqrt{x^2+1}+x\end{array}\right)
\left(\begin{array}{cc}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{array}\right),
$$
where
$$
\phi=-\frac{\pi}{4}-\frac{1}{2}\arctan x, \qquad
\theta=\frac{\pi}{4}-\frac{1}{2}\arctan x.
$$
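These angle formulas can be checked numerically; the following sketch verifies the factorization at a sample value $x=1$ (any real $x$ works):

```python
import numpy as np

def rot(t):
    # 2x2 rotation matrix through angle t
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

x = 1.0  # sample value; the identity holds for every real x
A = np.array([[1.0, 2 * x],
              [0.0, 1.0]])

phi   = -np.pi / 4 - 0.5 * np.arctan(x)
theta =  np.pi / 4 - 0.5 * np.arctan(x)
D = np.diag([np.sqrt(x**2 + 1) - x, np.sqrt(x**2 + 1) + x])

# The two rotations and the diagonal scaling recover the shear A.
assert np.allclose(rot(phi) @ D @ rot(theta), A)
```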
In reply to the comments below: Interpreting a diagonal matrix with positive entries along the diagonal as a scaling relies on allowing the scaling to be non-uniform, i.e., allowing it to scale different axes by different amounts. If the scaling matrices are restricted to be uniform, then, by using examples like the one above, you can write a square diagonal matrix with positive entries as a product of orthogonal matrices, shears, and a uniform scaling.
Rotation about a coordinate axis is a linear transformation, so it has a corresponding matrix. For instance, there is a matrix $A$ such that $A(x,y,z)^T$ is the vector that you get when you rotate $(x,y,z)^T$ $45^\circ$ about the $z$-axis. To find the matrix of a linear transformation, just work out where the transformation sends the basis vectors $(1,0,0)^T,(0,1,0)^T$, and $(0,0,1)^T$, and make those images the columns of the matrix.
For instance, the matrix that performs a $45^\circ$ rotation about the $z$-axis is $$A=\begin{bmatrix}1/\sqrt2&-1/\sqrt2&0\\1/\sqrt2&1/\sqrt2&0\\0&0&1\end{bmatrix}\;,$$ because this rotation takes $(1,0,0)^T$ to $(1/\sqrt2,1/\sqrt2,0)^T$, $(0,1,0)^T$ to $(-1/\sqrt2,1/\sqrt2,0)^T$, and $(0,0,1)^T$ to $(0,0,1)^T$.
Similarly, the matrix that performs a $90^\circ$ rotation about the $x$-axis is
$$B=\begin{bmatrix}1&0&0\\0&0&-1\\0&1&0\end{bmatrix}\;,$$ because that rotation takes the basis vectors $(1,0,0)^T,(0,1,0)^T$, and $(0,0,1)^T$ to $(1,0,0)^T$, $(0,0,1)^T$, and $(0,-1,0)^T$, respectively.
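The columns-as-images recipe is easy to confirm numerically; the following sketch builds both rotation matrices and checks where they send the basis vectors:

```python
import numpy as np

s = 1 / np.sqrt(2)

# 45-degree rotation about the z-axis: the columns are the images
# of the standard basis vectors e1, e2, e3.
A = np.array([[s, -s, 0],
              [s,  s, 0],
              [0,  0, 1]])

# 90-degree rotation about the x-axis.
B = np.array([[1, 0,  0],
              [0, 0, -1],
              [0, 1,  0]])

e1, e2, e3 = np.eye(3)
assert np.allclose(A @ e1, [s, s, 0])
assert np.allclose(A @ e2, [-s, s, 0])
assert np.allclose(A @ e3, e3)
assert np.allclose(B @ e2, [0, 0, 1])
assert np.allclose(B @ e3, [0, -1, 0])
```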
Scaling by $s_x,s_y,s_z$ means transforming $(x,y,z)^T$ to $(s_xx,s_yy,s_zz)^T$; this operation is performed by the matrix $$\begin{bmatrix}s_x&0&0\\0&s_y&0\\0&0&s_z\end{bmatrix}\;;$$ if you don’t immediately see why, calculate the vector $$\begin{bmatrix}s_x&0&0\\0&s_y&0\\0&0&s_z\end{bmatrix}\begin{bmatrix}x\\y\\z\end{bmatrix}\;.$$
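Carrying out that calculation in code (with sample scale factors of my choosing) makes the effect of the diagonal matrix explicit:

```python
import numpy as np

sx, sy, sz = 2.0, 3.0, 0.5   # sample scale factors
S = np.diag([sx, sy, sz])

v = np.array([1.0, -4.0, 6.0])
# Each coordinate is multiplied by the corresponding factor.
assert np.allclose(S @ v, [sx * 1.0, sy * -4.0, sz * 6.0])
```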
Translation by $(2,1,-1)^T$ is not a linear transformation; to translate $(x,y,z)^T$ by $(2,1,-1)^T$, just add to get $(x+2,y+1,z-1)^T$.
Thus, if you start with a vector $(x,y,z)^T$ and perform the operations described in (1), you get the vector
$$B\left(SA\begin{bmatrix}x\\y\\z\end{bmatrix}+\begin{bmatrix}2\\1\\-1\end{bmatrix}\right)\;,\tag{1}$$
where $S$ is the appropriate scaling matrix.
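The whole pipeline can be sketched as one function; the scale factors below are placeholders, since the actual factors come from the problem statement:

```python
import numpy as np

s = 1 / np.sqrt(2)
A = np.array([[s, -s, 0], [s, s, 0], [0, 0, 1]])   # 45° about the z-axis
B = np.array([[1, 0, 0], [0, 0, -1], [0, 1, 0]])   # 90° about the x-axis
S = np.diag([2.0, 2.0, 2.0])                       # placeholder scaling matrix
t = np.array([2.0, 1.0, -1.0])                     # the translation vector

def transform(v):
    # rotate about z, scale, translate, then rotate about x
    return B @ (S @ (A @ v) + t)

# Sanity check: rotation and scaling fix the origin, so its image is B @ t.
assert np.allclose(transform(np.zeros(3)), B @ t)
```

One would then feed $t_0,\dots,t_3$ through `transform` and compare the results with $u_0,\dots,u_3$.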
Now you can simply substitute the vectors $t_0,t_1,t_2$, and $t_3$ for $(x,y,z)^T$, calculate their images under this sequence of transformations, and see whether these images are $u_0,u_1,u_2$, and $u_3$, respectively.
At worst you can do the same thing with each of (2)-(5); this is tedious, but it works. If you’ve already learned something about how rotations, scaling, and translations interact with one another, you may be able to shorten the work considerably. For example, if you know that rotation commutes with uniform scaling, you’ll see that the operations in (1) and (4) have exactly the same result; in terms of $(1)$ this is because $SA=AS$ when $S=sI$ is a scalar multiple of the identity. (A diagonal matrix with distinct diagonal entries does not in general commute with a rotation.)
Best Answer
This isn’t even generally possible in 2D. Take, for example, $S_1=I$, $S_2=\operatorname{diag}(2,1)$ and two 45-degree rotations. We then have $$T = \begin{bmatrix}\frac1{\sqrt2}&-\frac1{\sqrt2}\\\frac1{\sqrt2}&\frac1{\sqrt2}\end{bmatrix} \begin{bmatrix}2&0\\0&1\end{bmatrix} \begin{bmatrix}\frac1{\sqrt2}&-\frac1{\sqrt2}\\\frac1{\sqrt2}&\frac1{\sqrt2}\end{bmatrix} = \begin{bmatrix}\frac12&-\frac32\\\frac32&-\frac12\end{bmatrix}.$$ Now, suppose that this can be decomposed into a scaling along the coordinate axes followed by a rotation. Then there’s some diagonal matrix $D$ such that $TD$ is orthogonal. Setting $D=\operatorname{diag}(a,b)$, this means that we must have $$\begin{bmatrix}\frac12a&\frac32a\\-\frac32b&-\frac12b\end{bmatrix}\begin{bmatrix}\frac12a&-\frac32b\\\frac32a&-\frac12b\end{bmatrix} = \begin{bmatrix}\frac52a^2&-\frac32ab\\-\frac32ab&\frac52b^2\end{bmatrix} = I.$$ There is no solution to this equation: the off-diagonal entries force $ab=0$, while the diagonal entries require both $a$ and $b$ to be nonzero.
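The counterexample is easy to reproduce numerically; the sketch below rebuilds $T$ and spot-checks that $TD$ fails to be orthogonal even for the diagonal $D$ that normalizes the diagonal entries of $(TD)^T(TD)$:

```python
import numpy as np

s = 1 / np.sqrt(2)
R45 = np.array([[s, -s], [s, s]])
S2 = np.diag([2.0, 1.0])

T = R45 @ S2 @ R45
assert np.allclose(T, [[0.5, -1.5], [1.5, -0.5]])

# (T D)^T (T D) has off-diagonal entries -3ab/2, so orthogonality
# would need a*b = 0, killing a column of T D.  Spot-check the
# choice a = b = sqrt(2/5), which makes the diagonal entries 1:
D = np.diag([np.sqrt(0.4), np.sqrt(0.4)])
M = (T @ D).T @ (T @ D)
assert np.allclose(np.diag(M), [1.0, 1.0])
assert not np.allclose(M, np.eye(2))
```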
You do have the polar decomposition available, which lets you decompose $T$ into a scaling along some set of orthogonal directions followed by a rotation, but there’s no guarantee that those scaling directions will coincide with the coordinate axes, as demonstrated above. For that example, we can write $$T = \begin{bmatrix}0&-1\\1&0\end{bmatrix} \begin{bmatrix}\frac32&-\frac12\\-\frac12&\frac32\end{bmatrix}.$$ The right-hand matrix scales by a factor of $2$ in the direction of $(1,-1)$ and by a factor of $1$ in the direction of $(1,1)$, as you can verify by computing eigenvalues and eigenvectors, while the left-hand matrix is a 90-degree rotation.
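That polar factorization can also be verified directly: the sketch below checks that the product recovers $T$, that the left factor is a rotation, and that the right factor has the stated eigenvalues and eigenvectors.

```python
import numpy as np

T = np.array([[0.5, -1.5], [1.5, -0.5]])

R = np.array([[0.0, -1.0], [1.0, 0.0]])    # 90-degree rotation
P = np.array([[1.5, -0.5], [-0.5, 1.5]])   # symmetric positive-definite factor

assert np.allclose(R @ P, T)
assert np.allclose(R @ R.T, np.eye(2))

# P scales by 2 along (1, -1) and by 1 along (1, 1).
w = np.linalg.eigh(P)[0]                   # eigenvalues, ascending
assert np.allclose(w, [1.0, 2.0])
assert np.allclose(P @ np.array([1.0, -1.0]), [2.0, -2.0])
assert np.allclose(P @ np.array([1.0, 1.0]), [1.0, 1.0])
```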