You want to find the Jordan Normal Form.
So, we are looking for an upper triangular matrix $J$ and an invertible matrix $P$ such that $J=P^{-1}AP$, where $A = \pmatrix{1&1\\ -1&3}$.
The characteristic equation is found by solving $\det(A-\lambda I) = 0$, which gives the characteristic polynomial $(\lambda-2)^2 = 0$ and hence a repeated eigenvalue, $\lambda = 2$.
To find the first eigenvector, we use the eigenvalue $\lambda = 2$ and solve $(A-\lambda I)v_1 = 0$, resulting in the eigenvector $v_1 = \langle 1, 1\rangle$.
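If you want to check these two steps yourself, here is a quick SymPy sketch (not part of the computation above, just a sanity check, assuming you have SymPy installed):

```python
from sympy import Matrix, symbols

lam = symbols('lambda')
A = Matrix([[1, 1], [-1, 3]])

# Characteristic polynomial: det(A - lambda*I) factors as (lambda - 2)^2
p = (A - lam * Matrix.eye(2)).det().factor()
print(p)

# The null space of (A - 2I) is spanned by the single eigenvector (1, 1)
eigvecs = (A - 2 * Matrix.eye(2)).nullspace()
print(eigvecs)
```

SymPy works with exact rationals, so there is no rounding to worry about.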
Note: I am not sure whether you have learned algebraic and geometric multiplicity yet, so I'll stay away from that terminology, but it is very important! See Geometric and Algebraic Multiplicity.
Next, we need to find another vector, but because we have a repeated eigenvalue with only one eigenvector, what we look for is called a generalized eigenvector (if it exists; see the Note above).
To find the generalized eigenvector, we can form $(A-\lambda I)v_2 = v_1$, so
$\pmatrix{-1&1 \\ -1&1}\pmatrix{v_{21} \\ v_{22}} = \pmatrix{1 \\ 1}$
$\implies v_{21} = v_{22}-1$. Letting $v_{22} = 0$ gives $v_{21} = -1$, so the generalized eigenvector is $v_2 = \langle -1, 0\rangle$.
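A one-line SymPy check that this $v_2$ really solves $(A-2I)v_2 = v_1$ (again just a sanity check, not part of the derivation):

```python
from sympy import Matrix

A = Matrix([[1, 1], [-1, 3]])
v1 = Matrix([1, 1])
v2 = Matrix([-1, 0])

# The generalized eigenvector satisfies (A - 2I) v2 = v1
print((A - 2 * Matrix.eye(2)) * v2)
```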
Now, we can form $P = (v_1 \mid v_2) = \pmatrix{1&-1 \\ 1&0}$; that is, $P$ is the matrix whose columns are $v_1$ and $v_2$.
Calculating $P^{-1} = \pmatrix{0&1 \\ -1&1}$; of course, this inverse is easy to compute for a $2 \times 2$ matrix, since here $\det P = 1$.
Finally, we can calculate $J=P^{-1}AP = \pmatrix{2&1 \\ 0&2}$.
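The whole computation can be verified in a few lines of SymPy; its built-in `jordan_form` (which returns a pair $(P, J)$) should produce the same Jordan block:

```python
from sympy import Matrix

A = Matrix([[1, 1], [-1, 3]])
P = Matrix([[1, -1], [1, 0]])  # columns are v1 and v2

# The similarity transform produces the Jordan block
J = P.inv() * A * P
print(J)

# SymPy's built-in Jordan decomposition returns the same J
Pj, Jj = A.jordan_form()
print(Jj)
```

(SymPy's $P$ may differ from ours, since the generalized eigenvector is not unique, but $J$ is the same.)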
Do you notice anything special about the matrix $J$?
Let's verify these results using Wolfram Alpha.
Does this make sense? Can you follow each step?
Follow-Up Answer
How do you do this in general?
Not every matrix has enough linearly independent eigenvectors to be diagonalizable. However, by using similarity transformations, every square matrix can be transformed to the Jordan canonical form, which is almost diagonal.
See these wonderful notes (particularly 7.2.1 for a sample $4 \times 4$, which shows all of the possibilities for a $4 \times 4$; but read it all and work through all of the examples with algebraic and geometric multiplicity): Jordan Canonical Form.
The general condition is the presence of nontrivial Jordan blocks; see Jordan Normal Form - look at the $4 \times 4$ example and work it.
Also look at this interesting post Number of Jordan canonical forms for an nxn matrix.
There are more examples for you to review here.
Additionally, you might find these helpful.
How to determine the diagonalizability of these two matrices?
When is the geometric multiplicity of an eigenvalue smaller than its algebraic multiplicity?
Can an eigenvalue (of an $n$ by $n$ matrix A) with algebraic multiplicity $n$ have an eigenspace with fewer than $n$ dimensions?
See Jordan Form Section
Regards - Amzoti
Best Answer
If you don't know about JNF, here's another process which generalizes easily. What I'm doing is simply following the constructive proof of Schur's decomposition (link provided below). Every now and then I'll choose random vectors that satisfy certain properties. There are infinitely many choices for these vectors, so you should pick your own while mimicking my answer; the $U$ and $P$ you'll end up with will probably be different.
Let $A=\begin{bmatrix} 3 & -1 & 1 \\ 2 & 0 & 0 \\ -1 & 1 & 3 \end{bmatrix}$.
You got only one linearly independent eigenvector, namely $v_1:=\begin{bmatrix} 1 & 1 & 0\end{bmatrix}^T$.
Consider $P_1:=\begin{bmatrix} v_1 \mid v_2 \mid v_3\end{bmatrix}$ by columns. You want an invertible $P$, so just let $v_2, v_3$ be such that $P_1$ is invertible. An easy choice is $v_2:=\begin{bmatrix} 1 & -1 & 0\end{bmatrix}^T$ and $v_3:=\begin{bmatrix} 0 & 0 & 1\end{bmatrix}^T$. It's easy to see $P_1$ is invertible because its columns are orthogonal.
This yields $P_1^{-1}AP_1=\begin{bmatrix} 2 & 3 & 1/2 \\ 0 & 1 & 1/2 \\ 0 & -2 & 3 \end{bmatrix}$. It's not an upper triangular matrix. Let $B=P_1^{-1}AP_1$.
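You can reproduce this first transform exactly with SymPy (a side check, not part of the argument):

```python
from sympy import Matrix

A = Matrix([[3, -1, 1], [2, 0, 0], [-1, 1, 3]])
P1 = Matrix([[1, 1, 0], [1, -1, 0], [0, 0, 1]])  # columns v1, v2, v3

# B is similar to A but not yet upper triangular
B = P1.inv() * A * P1
print(B)
```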
Suppose for a moment that there are matrices $P_2$ (invertible) and $T$ such that $P_2^{-1}BP_2=T$ where $T$ is an upper triangular matrix. This would yield $B=P_2TP_2^{-1}$ and $P_2TP_2^{-1}=P_1^{-1}AP_1$, thus giving $T=(P_2^{-1}P_1^{-1})A(P_1P_2)$.
So let's (try to) triangularize $B$.
Repeating the process wouldn't help, so let's instead try to triangularize the bottom-right $2\times 2$ block $\color{grey}{B_1:=}\begin{bmatrix} 1 & 1/2 \\ -2 & 3\end{bmatrix}$. (Why? See Schur decomposition theorem's proof by induction here (page 12).)
It's easy to check that $\left(2, \begin{bmatrix} 1\\ 2\end{bmatrix}\right)$ is an eigenpair of $B_1$ and that $B_1$ doesn't have any other linearly independent eigenvectors.
Define $P_{B_1}:=\begin{bmatrix}1 & -2\\2 & 1\end{bmatrix}$. The second column was chosen just to make $P_{B_1}$ invertible. There are, of course, other possibilities.
Then $P_{B_1}^{-1}B_1P_{B_1}=\begin{bmatrix}2 & 5/2\\ 0 & 2 \end{bmatrix}$.
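A SymPy check of this $2\times 2$ step (the `eigenvects` call also confirms that $\lambda = 2$ has algebraic multiplicity $2$ but only one independent eigenvector):

```python
from sympy import Matrix, Rational

B1 = Matrix([[1, Rational(1, 2)], [-2, 3]])
PB1 = Matrix([[1, -2], [2, 1]])

# One eigenvalue (2) of multiplicity 2 with a single eigenvector
print(B1.eigenvects())

# The similarity transform by P_{B_1} is upper triangular
T1 = PB1.inv() * B1 * PB1
print(T1)
```

Note SymPy scales the eigenvector to $(1/2, 1)$, which is proportional to our $(1, 2)$.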
Now it's possible to construct the aforementioned $P_2$. Let $P_2=\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & -2\\ 0 & 2 & 1\end{bmatrix}$.
Block multiplication assures us that $P_2$ does the job: it acts as the identity on the first coordinate and as $P_{B_1}$ on the remaining two.
Indeed $P_2^{-1}P_1^{-1}AP_1P_2=P_2^{-1}BP_2=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}$.
So just let $U:=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}$ and $P:=P_1P_2\color{grey}{=\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}}$.
Let's confirm it works. First find $P^{-1}=\begin{bmatrix}1/2 & 1/2 & 0\\ 1/10 & -1/10 & 2/5\\ -1/5& 1/5 & 1/5 \end{bmatrix}$.
Then $$\begin{align} P^{-1}AP&=\begin{bmatrix}1/2 & 1/2 & 0\\ 1/10 & -1/10 & 2/5\\ -1/5& 1/5 & 1/5 \end{bmatrix}\begin{bmatrix} 3 & -1 & 1 \\ 2 & 0 & 0 \\ -1 & 1 & 3 \end{bmatrix}\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}\\ &=\begin{bmatrix} 5/2 & -1/2 & 1/2\\-3/10 & 3/10 & 13/10\\ -2/5 & 2/5 & 2/5\end{bmatrix}\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}\\ &=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}.\end{align}$$
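The entire construction, end to end, in SymPy (just a verification of the hand computation above):

```python
from sympy import Matrix

A = Matrix([[3, -1, 1], [2, 0, 0], [-1, 1, 3]])
P1 = Matrix([[1, 1, 0], [1, -1, 0], [0, 0, 1]])
P2 = Matrix([[1, 0, 0], [0, 1, -2], [0, 2, 1]])

# Compose the two similarity transforms
P = P1 * P2
U = P.inv() * A * P

print(P)          # the combined change-of-basis matrix
print(U)          # upper triangular, with the eigenvalue 2 on the diagonal
print(U.is_upper)
```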