For your first question, the identity matrix does the trick: any two vectors, orthogonal or not, are eigenvectors with eigenvalue 1.
More generally, any linear combination of two eigenvectors with the same eigenvalue $\lambda$ is itself an eigenvector (with eigenvalue $\lambda$); even if your two original eigenvectors are orthogonal, a linear combination of them will generally not be orthogonal to either one.
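This is easy to check numerically; here is a small NumPy sketch (the particular vectors are just illustrative choices):

```python
import numpy as np

# Identity: every nonzero vector is an eigenvector with eigenvalue 1.
I = np.eye(2)
u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])          # not orthogonal to u
assert np.allclose(I @ u, 1 * u)
assert np.allclose(I @ v, 1 * v)

# A combination of same-eigenvalue eigenvectors is again an eigenvector.
w = 2 * u + 3 * v
assert np.allclose(I @ w, 1 * w)
print(np.dot(u, v))  # 1.0, so u and v are eigenvectors but not orthogonal
```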
For the second question, a complex-valued matrix is guaranteed to have real eigenvalues when it is Hermitian, which is to say that it is equal to its conjugate transpose: $A^\dagger = (A^T)^* = A$. (Hermitian implies real eigenvalues; the converse does not hold, since a non-Hermitian matrix can happen to have real eigenvalues too.) So while your $A$ is not Hermitian, the matrix
$$
B = \begin{bmatrix} 1 & i \\ -i & 1 \end{bmatrix}
$$
is, and has two real eigenvalues, $0$ and $2$.
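One can confirm both the Hermitian property and the eigenvalues with NumPy; `eigvalsh` is the routine intended for Hermitian input and returns real eigenvalues in ascending order:

```python
import numpy as np

B = np.array([[1, 1j], [-1j, 1]])

# Hermitian: equal to its own conjugate transpose
assert np.allclose(B, B.conj().T)

# Real eigenvalues, sorted ascending: approximately [0, 2]
assert np.allclose(np.linalg.eigvalsh(B), [0.0, 2.0])
```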
So, I realise that the example I'm looking for is a symmetric matrix with at least one repeated eigenvalue for which there are no orthogonal eigenvectors.
If you ever get two or more linearly independent eigenvectors corresponding to the same eigenvalue, you can apply the Gram-Schmidt process and end up with orthogonal vectors... which will still be eigenvectors!
Thus, there is no symmetric matrix that fails to have orthogonal eigenvectors. (Note that I'm not saying that all eigenvectors will be orthogonal, but that you can always find orthogonal ones.)
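A concrete sketch of that argument, using an assumed example of a symmetric matrix with the repeated eigenvalue $2$:

```python
import numpy as np

# Symmetric matrix with eigenvalue 2 repeated (illustrative example)
A = np.diag([2.0, 2.0, 3.0])

# Two independent but non-orthogonal eigenvectors for eigenvalue 2
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])
assert np.allclose(A @ v1, 2 * v1) and np.allclose(A @ v2, 2 * v2)

# One Gram-Schmidt step: subtract the projection of v2 onto v1
u2 = v2 - (v2 @ v1) / (v1 @ v1) * v1
assert np.isclose(u2 @ v1, 0.0)        # now orthogonal to v1...
assert np.allclose(A @ u2, 2 * u2)     # ...and still an eigenvector
```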
Best Answer
In this form it is not exactly true. For example, we can take a matrix that has no real eigenvalues at all, such as $$ A=\left(\begin{array}{ll}0&1\\-1&0\end{array}\right). $$ It is not symmetric, and yet, vacuously, all of its (nonexistent) eigenspaces are orthogonal.
However, the following is true: a real $n\times n$ matrix $A$ is symmetric if and only if all of its eigenspaces are orthogonal and the sum of these eigenspaces is the whole $\mathbb{R}^n$. This condition is equivalent to saying that there is an orthonormal basis consisting of eigenvectors of $A$, and that is the statement from the post that you mentioned.
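The "if" direction is what `numpy.linalg.eigh` delivers for any symmetric matrix: the columns of the returned matrix form an orthonormal eigenbasis. A quick sketch with a randomly generated symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
S = (M + M.T) / 2                  # symmetrize an arbitrary matrix

# eigh is designed for symmetric/Hermitian input: real eigenvalues w,
# and the columns of Q form an orthonormal eigenbasis of R^4
w, Q = np.linalg.eigh(S)
assert np.allclose(Q.T @ Q, np.eye(4))      # orthonormal columns
assert np.allclose(S @ Q, Q @ np.diag(w))   # eigenvector equation
```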
UPDATE: If you're interested in the same question but for matrices over $\mathbb{C}$ and using orthogonality that arises from the standard inner product $(a,b) = \sum a_i \overline{b}_i$, then the statement isn't true. The counterexample is the same matrix $A$ as above. It is not symmetric and not Hermitian, but it has two eigenspaces generated by vectors $(1, i)^T$ and $(1, -i)^T$ which are orthogonal.
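Both claims about this counterexample can be verified numerically; `np.vdot` computes exactly the standard Hermitian inner product (conjugating its first argument):

```python
import numpy as np

A = np.array([[0.0, 1.0], [-1.0, 0.0]])

# No real eigenvalues: over C they are +i and -i
assert np.allclose(sorted(np.linalg.eigvals(A).imag), [-1.0, 1.0])

# Eigenvectors (1, i)^T and (1, -i)^T
x = np.array([1.0, 1j])
y = np.array([1.0, -1j])
assert np.allclose(A @ x, 1j * x) and np.allclose(A @ y, -1j * y)

# They are orthogonal in the standard Hermitian inner product
assert np.isclose(np.vdot(y, x), 0.0)
```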