What you did was fine as far as it went, but since $A$ is only $2\times 2$, you can also simply solve for the eigenvalues. If you do, you find yourself solving the quadratic equation $$(a-\lambda)(d-\lambda)-bc=\lambda^2-(a+d)\lambda+(ad-bc)=0\;,\tag{1}$$ so $$\lambda=\frac{a+d\pm\sqrt{(a+d)^2-4(ad-bc)}}2\;,$$ and the eigenvalue is repeated iff
$$\begin{align*}
0&=(a+d)^2-4(ad-bc)\\
&=a^2-2ad+d^2+4bc\\
&=(a-d)^2+4bc\;,
\end{align*}$$
i.e., iff $4bc=-(a-d)^2$. This guarantees that $bc\le 0$, but clearly $bc$ need not be $0$, so $A$ need not be a diagonal matrix. Finally, it's clear from $(1)$ that $\det A$ is the product of the eigenvalues (even if you didn't know this already); since the eigenvalue is repeated, $\det A=\lambda^2\ge 0$, so the determinant cannot assume all real values.
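A quick numerical sanity check of this condition (a sketch; the entries below are arbitrary values chosen to satisfy $4bc=-(a-d)^2$):

```python
import numpy as np

# Pick a, d freely, then choose b, c with 4*b*c = -(a - d)**2,
# so the 2x2 matrix has a repeated eigenvalue.
a, d = 5.0, 1.0          # (a - d)^2 = 16, so we need b*c = -4
b, c = 2.0, -2.0         # b*c = -4 <= 0, yet b and c are nonzero
A = np.array([[a, b],
              [c, d]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)        # both eigenvalues are (numerically) (a + d)/2 = 3
print(np.linalg.det(A))   # det A = lambda^2 = 9 >= 0
```

Note that this particular $A$ is not diagonal even though its eigenvalue is repeated, illustrating the point above.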
Recall that the columns of a transformation matrix are the images of the basis and that when you right-multiply a matrix by a vector, the result is a linear combination of the columns of the matrix with coefficients given by the components of the vector.
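A minimal illustration of this column-combination fact (the matrix and vector here are arbitrary examples, not from the problem):

```python
import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
x = np.array([2, 0, -1])

# M @ x is the linear combination 2*col0 + 0*col1 + (-1)*col2.
combo = 2 * M[:, 0] + 0 * M[:, 1] - 1 * M[:, 2]
print(M @ x)                          # [-1  2  5]
print(np.array_equal(M @ x, combo))   # True
```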
The second column of $A$ is $(0,1,0)^T$, so that standard basis vector gets mapped to itself: it is an eigenvector with eigenvalue $1$. The sum of the first and third columns is $(2,0,2)^T=2(1,0,1)^T$, so $(1,0,1)^T$ is an eigenvector with eigenvalue $2$. Since the sum of the eigenvalues is equal to the trace, you get the third eigenvalue for free: it’s $1+1+1-1-2=0$. In fact, we already knew that $0$ is an eigenvalue, because the matrix has two identical columns and therefore has a nontrivial null space. You can either compute a basis for this null space to find an eigenvector with eigenvalue $0$, or notice that because the first and third columns are identical, their difference, i.e., the product of the matrix with $(1,0,-1)^T$, is $0$.
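Reading the matrix off the description (first and third columns $(1,0,1)^T$, second column $(0,1,0)^T$ — a reconstruction from the clues above, not stated explicitly), the three eigenpairs can be checked directly:

```python
import numpy as np

# Matrix reconstructed from the description (an assumption).
A = np.array([[1, 0, 1],
              [0, 1, 0],
              [1, 0, 1]])

pairs = [(1, np.array([0, 1, 0])),    # second column maps to itself
         (2, np.array([1, 0, 1])),    # sum of first and third columns
         (0, np.array([1, 0, -1]))]   # difference of identical columns

for lam, w in pairs:
    print(np.array_equal(A @ w, lam * w))   # True for all three

print(np.trace(A) == 1 + 2 + 0)             # eigenvalue sum equals trace: True
```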
Since the problem wants an orthogonal diagonalization—$SDS^T$ instead of $SDS^{-1}$—you’ll need to orthonormalize the above eigenvectors. They’re already mutually orthogonal (as eigenvectors of a real symmetric matrix corresponding to different eigenvalues always are), so all you need to do is normalize them.
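Concretely, normalizing the eigenvectors and assembling $S$ and $D$ (using the matrix read off the description above, which is an assumption):

```python
import numpy as np

A = np.array([[1, 0, 1],
              [0, 1, 0],
              [1, 0, 1]])   # the symmetric matrix implied by the description

# The eigenvectors, normalized to unit length.
v1 = np.array([0, 1, 0]) / 1.0
v2 = np.array([1, 0, 1]) / np.sqrt(2)
v3 = np.array([1, 0, -1]) / np.sqrt(2)

S = np.column_stack([v1, v2, v3])   # columns are orthonormal eigenvectors
D = np.diag([1, 2, 0])              # matching eigenvalues on the diagonal

print(np.allclose(S @ S.T, np.eye(3)))   # S is orthogonal: True
print(np.allclose(S @ D @ S.T, A))       # orthogonal diagonalization: True
```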
It can sometimes be faster and easier to look for eigenvectors first. When the rows are permutations of each other, the row sums are all equal, which means that the vector with all $1$s is an eigenvector with eigenvalue equal to this row sum. In this case, that’s $3u+v$.
Since the matrix is symmetric, it has an orthogonal eigenbasis, so looking at vectors orthogonal to $(1,1,1,1)^T$ could be fruitful. A simple thing to try is differences of pairs of columns, which leads to $(1,0,0,-1)^T$ and $(0,1,-1,0)^T$, both with eigenvalue $u-v$.
The last eigenvalue comes “for free”: the sum of the eigenvalues is equal to the trace, so the last eigenvalue is $v-u$. If we assume that $v\ne u$, this eigenvalue is distinct from the others, so we look for a vector that’s orthogonal to the three already found. One such is $(1,-1,-1,1)^T$.
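The whole computation can be checked numerically. One matrix fitting the description (an assumption on my part: $u$ everywhere except $v$ on the anti-diagonal, which makes it symmetric with permuted rows and row sum $3u+v$), with sample values for $u$ and $v$:

```python
import numpy as np

u, v = 2.0, 5.0   # sample values (any u != v works)

# Hypothetical matrix fitting the description: u everywhere, v on the anti-diagonal.
A = np.full((4, 4), u) + (v - u) * np.fliplr(np.eye(4))

pairs = [(3*u + v, np.array([1, 1, 1, 1])),     # all-ones vector, row sum
         (u - v,   np.array([1, 0, 0, -1])),    # difference of columns 1 and 4
         (u - v,   np.array([0, 1, -1, 0])),    # difference of columns 2 and 3
         (v - u,   np.array([1, -1, -1, 1]))]   # orthogonal to the other three

for lam, w in pairs:
    print(np.allclose(A @ w, lam * w))   # True for each eigenpair

# Sum of the eigenvalues equals the trace, 4u.
print(np.isclose((3*u + v) + 2*(u - v) + (v - u), np.trace(A)))   # True
```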