If you don't know about JNF, here's another process which generalizes easily. I'm simply following the constructive proof of Schur's decomposition (link provided below). Every now and then I'll choose vectors that satisfy certain properties; there are infinitely many valid choices, so pick your own while mimicking my answer. The $U$ and $P$ you end up with will probably be different from mine.
Let $A=\begin{bmatrix} 3 & -1 & 1 \\ 2 & 0 & 0 \\ -1 & 1 & 3 \end{bmatrix}$.
You found only one linearly independent eigenvector, namely $v_1:=\begin{bmatrix} 1 & 1 & 0\end{bmatrix}^T$, with eigenvalue $2$.
Consider $P_1:=\begin{bmatrix} v_1 \mid v_2 \mid v_3\end{bmatrix}$, defined by columns. You want an invertible $P$, so just let $v_2, v_3$ be such that $P_1$ is invertible. An easy choice is $v_2:=\begin{bmatrix} 1 & -1 & 0\end{bmatrix}^T$ and $v_3:=\begin{bmatrix} 0 & 0 & 1\end{bmatrix}^T$; it's easy to see $P_1$ is invertible because its columns are orthogonal.
This yields $B:=P_1^{-1}AP_1=\begin{bmatrix} 2 & 3 & 1/2 \\ 0 & 1 & 1/2 \\ 0 & -2 & 3 \end{bmatrix}$, which is not yet upper triangular.
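(Not part of the construction, but if you want to sanity-check this step, here is a small SymPy sketch that redoes the similarity transform in exact rational arithmetic:

```python
import sympy as sp

# A and P1 as above; the columns of P1 are v1, v2, v3.
A = sp.Matrix([[3, -1, 1], [2, 0, 0], [-1, 1, 3]])
P1 = sp.Matrix([[1, 1, 0], [1, -1, 0], [0, 0, 1]])

B = P1.inv() * A * P1
print(B)  # Matrix([[2, 3, 1/2], [0, 1, 1/2], [0, -2, 3]])
```

)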
Suppose for a moment that there are matrices $P_2$ (invertible) and $T$ such that $P_2^{-1}BP_2=T$ where $T$ is an upper triangular matrix. This would yield $B=P_2TP_2^{-1}$ and $P_2TP_2^{-1}=P_1^{-1}AP_1$, thus giving $T=(P_2^{-1}P_1^{-1})A(P_1P_2)$.
So let's (try to) triangularize $B$.
Repeating the process wouldn't help, so let's instead try to triangularize the lower-right block $\color{grey}{B_1:=}\begin{bmatrix} 1 & 1/2 \\ -2 & 3\end{bmatrix}$. (Why? See the proof by induction of the Schur decomposition theorem here, page 12.)
It's easy to check that $\left(2, \begin{bmatrix} 1\\ 2\end{bmatrix}\right)$ is an eigenpair of $B_1$ and that $B_1$ doesn't have any other linearly independent eigenvectors.
Define $P_{B_1}:=\begin{bmatrix}1 & -2\\2 & 1\end{bmatrix}$. The second column was chosen just to make $P_{B_1}$ invertible. There are, of course, other possibilities.
Then $P_{B_1}^{-1}B_1P_{B_1}=\begin{bmatrix}2 & 5/2\\ 0 & 2 \end{bmatrix}$.
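(Again as a side check, my addition: SymPy confirms both the eigenpair and this $2\times 2$ conjugation.

```python
import sympy as sp

B1 = sp.Matrix([[1, sp.Rational(1, 2)], [-2, 3]])
# Eigenvalue 2 with algebraic multiplicity 2, but only one eigenvector
# (a multiple of [1, 2]^T), so B1 is not diagonalizable.
print(B1.eigenvects())

PB1 = sp.Matrix([[1, -2], [2, 1]])
print(PB1.inv() * B1 * PB1)  # Matrix([[2, 5/2], [0, 2]])
```

)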
Now it's possible to construct the aforementioned $P_2$. Let $P_2=\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & -2\\ 0 & 2 & 1\end{bmatrix}$.
Block multiplication assures $P_2$ does the job: writing $B=\begin{bmatrix} 2 & b^T \\ 0 & B_1\end{bmatrix}$ and $P_2=\begin{bmatrix} 1 & 0 \\ 0 & P_{B_1}\end{bmatrix}$ in block form gives $P_2^{-1}BP_2=\begin{bmatrix} 2 & b^TP_{B_1} \\ 0 & P_{B_1}^{-1}B_1P_{B_1}\end{bmatrix}$, which is upper triangular.
Indeed $P_2^{-1}P_1^{-1}AP_1P_2=P_2^{-1}BP_2=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}$.
So just let $U:=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}$ and $P:=P_1P_2\color{grey}{=\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}}$.
Let's confirm it works. First find $P^{-1}=\begin{bmatrix}1/2 & 1/2 & 0\\ 1/10 & -1/10 & 2/5\\ -1/5& 1/5 & 1/5 \end{bmatrix}$.
Then $$\begin{align} P^{-1}AP&=\begin{bmatrix}1/2 & 1/2 & 0\\ 1/10 & -1/10 & 2/5\\ -1/5& 1/5 & 1/5 \end{bmatrix}\begin{bmatrix} 3 & -1 & 1 \\ 2 & 0 & 0 \\ -1 & 1 & 3 \end{bmatrix}\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}\\
&=\begin{bmatrix} 5/2 & -1/2 & 1/2\\-3/10 & 3/10 & 13/10\\ -2/5 & 2/5 & 2/5\end{bmatrix}\begin{bmatrix} 1 & 1 & -2\\ 1 & -1 & 2\\ 0 & 2 & 1\end{bmatrix}\\
&=\begin{bmatrix} 2 & 4 & -11/2\\ 0 & 2 & 5/2\\ 0 & 0 & 2\end{bmatrix}.\end{align}$$
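For good measure, here is an end-to-end SymPy check (my addition, not part of the proof) of the whole construction:

```python
import sympy as sp

A = sp.Matrix([[3, -1, 1], [2, 0, 0], [-1, 1, 3]])
P1 = sp.Matrix([[1, 1, 0], [1, -1, 0], [0, 0, 1]])
P2 = sp.Matrix([[1, 0, 0], [0, 1, -2], [0, 2, 1]])

P = P1 * P2
U = P.inv() * A * P
print(P)  # Matrix([[1, 1, -2], [1, -1, 2], [0, 2, 1]])
print(U)  # Matrix([[2, 4, -11/2], [0, 2, 5/2], [0, 0, 2]])
assert U.is_upper  # upper triangular, eigenvalue 2 repeated on the diagonal
```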
Best Answer
What you have done so far appears correct to me.
Here is the idea without resorting to diagonalization: the $v_j$ form an orthonormal basis, and so you can write every $x = \sum c_j v_j$. By linearity,
$$ Ax = A(\sum c_j v_j) = \sum c_j Av_j = \sum c_j \lambda_j v_j. $$
Then
$$ \left< x,Ax\right> = \left< \sum c_i v_i ,A\sum c_j v_j \right> = \left< \sum c_i v_i ,\sum c_j \lambda_j v_j \right> = \sum_{i,j}c_i c_j \lambda_j \left<v_i,v_j\right> = \sum_j c_j^2 \lambda_j $$
since $\left<v_i,v_j\right> = 1$ if $i = j$ and $0$ otherwise. In particular, how can you choose the $c_j$ so that $\sum (c_j)^2 \lambda_j$ is at its maximum or minimum?
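As a purely numerical aside (not part of the argument, and using a made-up $2\times 2$ symmetric matrix, since the question's matrix isn't reproduced here), NumPy illustrates that $\left<x,Ax\right>$ on unit vectors is exactly this weighted average of eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric; eigenvalues 1 and 3
lams, Q = np.linalg.eigh(A)             # columns of Q: orthonormal eigenvectors

# At a unit eigenvector, <v, Av> equals the corresponding eigenvalue.
for lam, v in zip(lams, Q.T):
    print(lam, v @ A @ v)

# A generic unit vector x = sum_j c_j v_j mixes them: <x, Ax> = sum_j c_j^2 lam_j.
c = np.array([0.6, 0.8])                # c_1^2 + c_2^2 = 1
x = Q @ c
print(x @ A @ x, c**2 @ lams)           # both 0.36*1 + 0.64*3 = 2.28
```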
If you want to know how this relates to diagonalization (something you should eventually learn about), $A$ being symmetric ensures it has orthogonal eigenvectors, and that you can diagonalize $A$ as
$$ A = QDQ^T $$
where the columns of $Q$ are your eigenvectors, and $D$ is a diagonal matrix with the eigenvalues on the diagonal. Now, take any $x$, and write $x = \sum c_j v_j$ where $v_j$ are the eigenvectors you found. Note that $Q^Tv_j = e_j$, the standard basis vectors. Also $De_j = \lambda_j e_j$ and $Qe_j = v_j$. Thus,
$$ \begin{align} x^TAx = x^TQDQ^Tx &= x^TQD\left(\sum c_j e_j\right) \\ &= x^TQ\left(\sum \lambda_j c_j e_j\right) \\ &= x^T\left(\sum \lambda_j c_j v_j\right) \\ &= \sum (c_j)^2 \lambda_j. \end{align} $$
You must have $\sum (c_j)^2 = 1$ so that $x$ lies on the unit sphere. (This constraint is only required because a quadratic form doesn't in general have a global max or min: $\left<\alpha x , A(\alpha x) \right> = \alpha^2 \left<x,Ax\right>$, so we just fix a particular scale, $1$ in this case.)
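And a quick numeric check (same made-up matrix as above, my addition) of the identities used in this derivation:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
lams, Q = np.linalg.eigh(A)
D = np.diag(lams)

print(np.allclose(A, Q @ D @ Q.T))       # True: A = Q D Q^T
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q^T Q = I, so Q^T v_j = e_j
```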