By the spectral theorem, given a real symmetric matrix $A$, there exists an orthonormal basis of eigenvectors of $A$. So, yes, a symmetric matrix always admits a full set of orthogonal eigenvectors.
For example, the identity matrix has $1$ as its only (repeated) eigenvalue, yet there still exists an orthonormal basis of eigenvectors of the identity matrix, the repeated eigenvalue notwithstanding.
In summary, you're correct to say that you can always find a basis of orthogonal eigenvectors given a symmetric matrix, regardless of whether the eigenvectors correspond to repeated eigenvalues.
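As a quick numerical sketch of this (assuming NumPy; the matrix below is a made-up example with eigenvalues $2,2,5$, so the eigenvalue $2$ is repeated), `eigh` still returns an orthonormal eigenbasis:

```python
import numpy as np

# Hypothetical symmetric matrix with a repeated eigenvalue: 2I + (all-ones),
# whose eigenvalues are 2, 2, 5.
A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0]])

vals, Q = np.linalg.eigh(A)   # eigh returns an orthonormal eigenbasis for symmetric input

# Columns of Q are orthonormal eigenvectors even for the repeated eigenvalue 2:
orthonormal = np.allclose(Q.T @ Q, np.eye(3))
reconstructs = np.allclose(Q @ np.diag(vals) @ Q.T, A)
```

Here `orthonormal` and `reconstructs` both come out true, which is exactly the statement $A = Q\Lambda Q^{T}$ with $Q$ orthogonal.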
No, there are plenty of matrices with repeated eigenvalues which are diagonalizable. The easiest example is
$$
A=\begin{bmatrix}1&0\\0&1\end{bmatrix}.
$$
The identity matrix has $1$ as a double eigenvalue and is (already) diagonal. If you want to write this in diagonalized form, you can write
$$
A=I^{-1}AI=I^{-1}II
$$
since $A$ is a diagonal matrix.
In general, $2\times 2$ matrices with repeated eigenvalue $\lambda$ are diagonalizable if and only if the eigenspace corresponding to $\lambda$ is two dimensional. In other words, if
$$
A-\lambda I=\begin{bmatrix}a-\lambda&b\\c&d-\lambda\end{bmatrix}
$$
has a two-dimensional null space. By the rank–nullity theorem, this happens exactly when $A-\lambda I$ has rank $0$, i.e., no pivots. If $A-\lambda I$ has any nonzero entry, then it has a pivot. Therefore, a $2\times 2$ matrix with a repeated eigenvalue is diagonalizable if and only if it equals $\lambda I$.
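To see this rank–nullity criterion in action (a sketch assuming NumPy; $J$ below is the standard non-diagonalizable example $\begin{bmatrix}1&1\\0&1\end{bmatrix}$):

```python
import numpy as np

# Hypothetical check: J has repeated eigenvalue 1 but is not lambda*I.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0

# By rank-nullity, dim(eigenspace) = 2 - rank(J - lam*I).
eigenspace_dim = 2 - np.linalg.matrix_rank(J - lam * np.eye(2))
# eigenspace_dim is 1, not 2, so J is not diagonalizable.

# For lam*I itself, A - lam*I is the zero matrix (rank 0), giving a 2-D eigenspace:
identity_dim = 2 - np.linalg.matrix_rank(lam * np.eye(2) - lam * np.eye(2))
```

The nonzero entry of $J-\lambda I$ produces a pivot, the null space drops to one dimension, and diagonalizability fails, matching the claim above.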
If $B$ is an $n\times n$ matrix, all of whose eigenvalues are $\lambda$, a similar result holds. A sneakier way to prove this is to observe that if $B$ is diagonalizable, then
$$
B=P^{-1}(\lambda I)P=\lambda P^{-1}IP=\lambda I,
$$
where $P$ is an invertible (change-of-basis) matrix.
Therefore, the only diagonalizable $n\times n$ matrices whose eigenvalues are all equal are the multiples of the identity.
If only some of $B$'s eigenvalues are repeated, then the situation becomes more complicated, and you really need to compute the dimensions of all the eigenspaces.
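A small sketch of that eigenspace bookkeeping (assuming NumPy; $B$ below is a made-up example where the eigenvalue $2$ has algebraic multiplicity $2$ but only a one-dimensional eigenspace, while $3$ is simple):

```python
import numpy as np

# Hypothetical example: eigenvalue 2 is repeated but defective, eigenvalue 3 is simple.
B = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

def eigenspace_dim(M, lam):
    """Geometric multiplicity of lam: nullity of M - lam*I, via rank-nullity."""
    n = M.shape[0]
    return n - np.linalg.matrix_rank(M - lam * np.eye(n))

dims = {lam: eigenspace_dim(B, lam) for lam in (2.0, 3.0)}
# dims[2.0] == 1 < 2, so B is not diagonalizable, even though eigenvalue 3 is fine.
```

The eigenspace dimensions add up to $2 < 3$, so the eigenvectors cannot span $\mathbb{R}^3$ and $B$ is not diagonalizable.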
As the other posters comment, there are diagonal matrices which are not multiples of the identity, for example
$$
\begin{bmatrix}1&0\\0&2\end{bmatrix}
$$
and if all the eigenvalues of a matrix are distinct, then the matrix is automatically diagonalizable. Still, there are plenty of cases where a matrix is diagonalizable but has repeated eigenvalues.
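Both situations can be checked numerically (a sketch assuming NumPy; the two matrices below are made-up examples, one with distinct eigenvalues and one diagonal with a repeated eigenvalue):

```python
import numpy as np

# Distinct eigenvalues 1 and 2: diagonalizability is automatic.
D = np.array([[1.0, 0.0],
              [0.0, 2.0]])
# Repeated eigenvalue 2, yet diagonalizable (it is already diagonal).
R = np.diag([2.0, 2.0, 3.0])

results = []
for M in (D, R):
    vals, P = np.linalg.eig(M)
    # P invertible and M = P diag(vals) P^{-1} is a genuine diagonalization.
    results.append(np.allclose(P @ np.diag(vals) @ np.linalg.inv(P), M))
```

Both checks succeed: distinct eigenvalues are sufficient for diagonalizability, but as $R$ shows, they are not necessary.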
Best Answer
I guess you are under the assumption that your matrices have real coefficients. Therefore, if $z$ is a root of the characteristic polynomial, then $\bar{z}$ is also a root.
If $z\in\mathbb{C}$ and $z\notin\mathbb{R}$, then $z\ne\bar{z}$. If the double root were complex and not real, you would have three complex non-real roots (counted with multiplicity): this is a contradiction, because a degree $3$ polynomial with real coefficients has at least one real root.
In a different wording: if the roots are $\lambda$ and $a\pm bi$, with $b\ne0$, then these roots are distinct.
Assume now that the eigenvalues are $\lambda\in\mathbb{R}$ and $a\pm bi\in\mathbb{C}$, with $b\ne0$. If $f\colon\mathbb{R}^3\to\mathbb{R}^3$ is defined by $f(v)=Av$ (where $A$ is the given matrix), then you want to find a basis such that \begin{align} f(v_1)&=av_1-bv_2\\ f(v_2)&=bv_1+av_2\\ f(v_3)&=\lambda v_3 \end{align} The choice of $v_3$ is obvious: it must be an eigenvector relative to $\lambda$.
Take an eigenvector $w$ relative to $a+bi$ and split it as $w=v_1+iv_2$, where $v_1,v_2\in\mathbb{R}^3$. Then $Aw=(a+bi)w$ translates into $$ Av_1+iAv_2=(av_1-bv_2)+i(bv_1+av_2) $$ Can you finish?
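If you want to check the bookkeeping numerically afterwards, here is a sketch (assuming NumPy; the matrix $A$ below is a made-up example with eigenvalues $2$ and $\pm i$, so $a=0$, $b=1$). Splitting the complex eigenvector $w=v_1+iv_2$ and comparing real and imaginary parts of $Aw=(a+bi)w$ gives exactly the two relations $Av_1=av_1-bv_2$ and $Av_2=bv_1+av_2$:

```python
import numpy as np

# Hypothetical real 3x3 matrix with one real eigenvalue (2) and a pair a +/- bi = +/- i.
A = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 2.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.imag)            # pick the eigenvalue a + bi with b > 0
a, b = vals[k].real, vals[k].imag
w = vecs[:, k]
v1, v2 = w.real, w.imag             # split w = v1 + i*v2

# Real and imaginary parts of A w = (a + bi) w:
check1 = np.allclose(A @ v1, a * v1 - b * v2)
check2 = np.allclose(A @ v2, b * v1 + a * v2)
```

Both checks hold, which is precisely the pair of equations you need to write $f$ in the basis $(v_1, v_2, v_3)$.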