"one dimensional" refers to the dimension of the space of eigenvectors for a particular eigenvalue. All the eigenvectors corresponding to the eigenvalue -1 are multiples of $x_1$. In other words, they are spanned by one vector, so the space of eigenvectors has dimension one.
Do not confuse this with the fact that $x_1 = (1,1)$ has 2 coordinates. That just means that $x_1$ lives in a 2 dimensional space $\mathbb{R}^2$ (or whatever your field is). The space of eigenvectors is a subspace of that 2 dimensional space, and that subspace is 1 dimensional.
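To make this concrete, here is a quick check in Python. The matrix below is my own example, not necessarily the one from your question; I chose it because it happens to have $(1,1)$ as an eigenvector for the eigenvalue $-1$:

```python
import numpy as np

# Sample 2x2 matrix (my own example, assumed for illustration); it has
# eigenvalue -1 with eigenvector (1, 1) and eigenvalue 1 with (1, -1).
A = np.array([[0.0, -1.0],
              [-1.0, 0.0]])

x1 = np.array([1.0, 1.0])

# x1 has two coordinates because it lives in R^2 ...
print(A @ x1)          # equals -1 * x1, so x1 is an eigenvector for -1

# ... but the eigenspace for -1 is only one-dimensional: every vector
# in it is a scalar multiple of x1.
print(A @ (3.5 * x1))  # still -1 times the input vector
```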
There seem to be several points of confusion here, so let me set your questions aside for now and start from the top.
The eigenvalues of a matrix $A$ are defined as the set of values $\lambda$ for which the matrix $A-\lambda I$ is singular. Put another way, the eigenvalues of the matrix $A$ are the set of values $\lambda$ for which
$$p(\lambda) = \det(A-\lambda I) = 0$$
The expression $\det(A-\lambda I)$ is called the characteristic polynomial of $A$, and the eigenvalues are precisely the roots of this polynomial. In general, the characteristic polynomial of an $n\times n$ matrix has degree $n$, so it has at most $n$ distinct roots (and exactly $n$ roots counted with multiplicity, over an algebraically closed field such as $\mathbb{C}$). The set of eigenvalues is what we call the spectrum of $A$. The spectrum is the set of values that appears on the diagonal of your diagonal matrix; these values are unique, but only up to order.
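You can see both descriptions of the spectrum agree numerically. The matrix below is my own example (any triangular matrix works, since its eigenvalues sit on the diagonal):

```python
import numpy as np

# Sample 3x3 matrix (my own example); triangular, so its eigenvalues
# are the diagonal entries: 2 (with multiplicity 2) and 5.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# numpy computes the spectrum directly ...
eigvals = np.linalg.eigvals(A)

# ... and it agrees with the roots of the characteristic polynomial,
# whose coefficients np.poly recovers from the matrix.
char_poly_coeffs = np.poly(A)   # degree-3 polynomial for a 3x3 matrix
roots = np.roots(char_poly_coeffs)

print(np.sort(eigvals.real))    # 2 appears twice: multiplicity 2
print(np.sort(roots.real))     # same spectrum, up to rounding
```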
So let me now address your question: "Are the eigenvalues of a matrix unique?" Well, that's a bit difficult to answer, because the question is not well formulated. If a matrix has $n$ distinct eigenvalues, would you consider each eigenvalue to be unique (in the sense of multiplicity one)? If a matrix has only a single eigenvalue of multiplicity $n$, would you consider that to be unique?
In either case, the answer to your question would be no. A matrix does not necessarily have distinct eigenvalues (although almost all do), and a matrix does not necessarily have a single eigenvalue of multiplicity $n$. In fact, given any set of $n$ values, you can construct a matrix with those values as eigenvalues: just take the corresponding diagonal matrix.
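The diagonal construction really is that short; here is a sketch with values I picked arbitrarily:

```python
import numpy as np

# Any prescribed list of values can be realized as a spectrum:
# just place them on the diagonal of a diagonal matrix.
values = [3.0, 3.0, -1.0, 0.0]   # repeats and zero are both allowed
D = np.diag(values)

print(np.linalg.eigvals(D))      # exactly the prescribed values
```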
Now on to eigenvectors. For each eigenvalue $\lambda$, there is a subspace $E_\lambda$ consisting of all the vectors $\mathbf{v}$ that satisfy
$$A\mathbf{v} = \lambda\mathbf{v}.$$
Now this eigenspace $E_\lambda$ is unique, but the vectors in it, the eigenvectors, are not unique. It is analogous to the fact that you can talk about there being a unique $x$-axis, but it makes no sense to talk about a unique point on the $x$-axis.
What is true is that the eigenspaces of different eigenvalues are independent, so eigenvectors corresponding to different eigenvalues are linearly independent. When your matrix is diagonalizable, the collection (or direct sum, if you are familiar with the term) of these eigenspaces is your entire vector space. This means that there exists a basis consisting solely of eigenvectors, and your matrix $S$ is formed by taking such a basis of eigenvectors as its columns. Of course, each eigenspace is in fact a subspace, so linear combinations of eigenvectors from the same eigenspace remain eigenvectors. This is why you can multiply your eigenvectors by nonzero scalars and still have them remain eigenvectors. In fact, you are free to choose any basis for each eigenspace, and your matrix $S$ will be modified correspondingly.
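Here is a small numerical illustration of that freedom (the matrix is again my own example): rescaling the columns of $S$ picks a different basis of the same eigenspaces, and the result still diagonalizes $A$ to the same $D$.

```python
import numpy as np

# Sample diagonalizable matrix (my own example), eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# One choice of eigenvector basis as the columns of S ...
S = np.array([[1.0, 1.0],
              [1.0, -2.0]])
D = np.linalg.inv(S) @ A @ S
print(np.round(D, 10))           # diag(5, 2)

# ... and rescaling the columns (a different basis for the same
# one-dimensional eigenspaces) gives another valid S, same D.
S2 = S @ np.diag([7.0, -0.5])
D2 = np.linalg.inv(S2) @ A @ S2
print(np.round(D2, 10))          # still diag(5, 2)
```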
Best Answer
The matrices $AMA^{-1}$ and $M$ are similar, so they have the same eigenvalues; since $AMA^{-1}=M^2$, it follows that $M$ and $M^2$ have the same eigenvalues. In particular, they have the same nonzero eigenvalues. Therefore, if $\lambda$ is a nonzero eigenvalue of $M$, then so is $\lambda^2$. But then so is $\lambda^4$, and so on. On the other hand, $M$ has only a finite number of eigenvalues. Therefore $\lambda^{2^k}=\lambda^{2^l}$ for some $k,l\in\mathbb{N}$ with $k>l$. So, $\lambda^{2^l}(\lambda^{2^k-2^l}-1)=0$. But $\lambda^{2^l}\neq0$, and therefore $\lambda^{2^k-2^l}=1$.
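For a concrete instance (my own example, assuming the hypothesis $AMA^{-1}=M^2$): squaring permutes the two primitive cube roots of unity, so a diagonal $M$ built from them is conjugate to $M^2$ via the swap matrix, and its eigenvalues are indeed roots of unity.

```python
import numpy as np

# My own example of a pair with A M A^{-1} = M^2: squaring swaps the
# two primitive cube roots of unity, and A is the matrix undoing the swap.
w = np.exp(2j * np.pi / 3)       # primitive cube root of unity
M = np.diag([w, w**2])
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])       # the swap matrix is its own inverse

# Verify the hypothesis A M A^{-1} = M^2 ...
assert np.allclose(A @ M @ np.linalg.inv(A), M @ M)

# ... and the conclusion: every nonzero eigenvalue of M is a root of
# unity (here lambda^3 = 1 for both eigenvalues).
print(np.allclose(w**3, 1.0))    # True
```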