Let
$ \det(sI-A)=a(s)=s^n+a_{n-1}s^{n-1}+ \cdots + a_1 s+a_0 $ be the characteristic polynomial. We know by assumption that it has exactly $n$ distinct roots, namely: $\lambda_1,\dots,\lambda_n$.
Let $ \Lambda = \begin{pmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & \lambda_n
\end{pmatrix}\text{ }$ be the diagonal matrix whose diagonal entries are the roots of $a(s)$.
We want to show that $ \Lambda^n + a_{n-1}\Lambda^{n-1}+ \cdots + a_0 I = 0. $ Since
$$\Lambda^k = \begin{pmatrix}
\lambda_1^k & 0 & \cdots & 0 \\
0 & \lambda_2^k & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & \lambda_n^k
\end{pmatrix},$$
we obtain $$\Lambda^n + a_{n-1}\Lambda^{n-1}+ \cdots + a_0 I=
\begin{pmatrix}
a(\lambda_1) & 0 & \cdots & 0 \\
0 & a(\lambda_2) & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & a(\lambda_n)
\end{pmatrix}=
\begin{pmatrix}
0 & 0 & \cdots & 0 \\
0 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & 0
\end{pmatrix}$$
because $\lambda_i$ are roots of $a(s)$.
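As a quick numerical sanity check of the diagonal case, here is a minimal sketch with a hypothetical choice of $n=3$ and roots $1,2,3$ (the eigenvalues and coefficients are illustrative, not from the original problem):

```python
import numpy as np

# Hypothetical example: n = 3 with distinct roots 1, 2, 3, so
# a(s) = (s-1)(s-2)(s-3) = s^3 - 6s^2 + 11s - 6.
lams = [1.0, 2.0, 3.0]
Lam = np.diag(lams)
coeffs = [1.0, -6.0, 11.0, -6.0]   # leading coefficient first

n = len(lams)
# a(Lambda) = Lambda^3 - 6*Lambda^2 + 11*Lambda - 6*I; each diagonal
# entry is a(lambda_i), which is 0 because lambda_i is a root of a(s).
a_of_Lam = sum(c * np.linalg.matrix_power(Lam, n - k)
               for k, c in enumerate(coeffs))

print(np.allclose(a_of_Lam, np.zeros((n, n))))  # True
```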
The second part asks us to prove the theorem for a matrix $A$ similar to $\Lambda$, i.e.
$$ A^n + a_{n-1}A^{n-1}+ \cdots + a_0 I = 0 .$$
By the definition of similarity, $A = T\Lambda T^{-1}$ for some invertible $T$, with $\Lambda$ diagonal, and hence
$$ A^k = T\Lambda^k T^{-1} \quad \forall k\in \Bbb N.$$
Finally: $$A^n + a_{n-1}A^{n-1}+\cdots+a_0 I = 0\Longleftrightarrow T \Lambda^n T^{-1} + a_{n-1}T \Lambda^{n-1} T^{-1}+\cdots + a_0 TT^{-1}= 0 $$
$$\Longleftrightarrow T( \Lambda^n + a_{n-1} \Lambda^{n-1} +\cdots + a_0I)T^{-1}= 0 \Longleftrightarrow \Lambda^n + a_{n-1} \Lambda^{n-1} +\cdots + a_0I =T^{-1}0T=0.$$
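The similarity argument can also be checked numerically. In this sketch the eigenvalues are again a hypothetical $1,2,3$, and $T$ is a random (almost surely invertible) change of basis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical diagonalizable A = T Lambda T^{-1}, with illustrative
# eigenvalues 1, 2, 3 and a random (almost surely invertible) T.
lams = [1.0, 2.0, 3.0]
T = rng.standard_normal((3, 3))
A = T @ np.diag(lams) @ np.linalg.inv(T)

# a(s) = (s-1)(s-2)(s-3) = s^3 - 6s^2 + 11s - 6
coeffs = [1.0, -6.0, 11.0, -6.0]
a_of_A = sum(c * np.linalg.matrix_power(A, 3 - k)
             for k, c in enumerate(coeffs))

# Cayley-Hamilton for the similar matrix: a(A) = T a(Lambda) T^{-1} = 0,
# up to floating-point roundoff.
print(np.allclose(a_of_A, 0, atol=1e-8))
```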
For a $2\times 2$ matrix with repeated eigenvalue, and only one linearly independent eigenvector, one can obtain a second solution as $\vec y_2(t)=te^{\lambda t}\vec v_1+e^{\lambda t}\vec v_2$, where $\vec v_1$ is the eigenvector for the matrix, and $\vec v_2$ is a "generalized eigenvector", which satisfies $(A-\lambda I)\vec v_2=\vec v_1$, or equivalently satisfies $(A-\lambda I)^2\vec v_2=0$. So the general solution becomes $$\vec y(t)=c_1e^{\lambda t}\vec v_1+c_2(te^{\lambda t}\vec v_1+e^{\lambda t}\vec v_2)$$
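A concrete sketch of this construction, using a hypothetical matrix with repeated eigenvalue $\lambda=3$ and a single eigenvector (the matrix and the time $t$ are illustrative):

```python
import numpy as np

# Hypothetical 2x2 matrix with repeated eigenvalue lam = 3 and only
# one independent eigenvector v1 = (1, 0).
A = np.array([[3.0, 1.0],
              [0.0, 3.0]])
lam = 3.0
v1 = np.array([1.0, 0.0])

# Generalized eigenvector: (A - lam*I) v2 = v1 is solved by v2 = (0, 1),
# and consequently (A - lam*I)^2 v2 = 0.
v2 = np.array([0.0, 1.0])
assert np.allclose((A - lam * np.eye(2)) @ v2, v1)
assert np.allclose(np.linalg.matrix_power(A - lam * np.eye(2), 2) @ v2, 0)

# Check that y2(t) = t e^{lam t} v1 + e^{lam t} v2 solves y' = A y
# at a sample time, using the product rule for y2'(t).
t = 0.7
y2 = t * np.exp(lam * t) * v1 + np.exp(lam * t) * v2
y2_prime = (np.exp(lam * t) * v1
            + lam * t * np.exp(lam * t) * v1
            + lam * np.exp(lam * t) * v2)
print(np.allclose(y2_prime, A @ y2))  # True
```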
If you are familiar with the matrix exponential, then you could also solve this problem using $e^{At}$ as a fundamental matrix. Notice here that $A$ can be written as the sum of a scalar multiple of the identity matrix and a nilpotent matrix (i.e. one with $N^k=0$ for some $k\in\Bbb N$), so $A=\lambda I+N$; since $\lambda I$ and $N$ commute, $e^{At}=e^{\lambda It}e^{Nt}=e^{\lambda t}e^{Nt}$.
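For the same kind of matrix, the nilpotent part makes $e^{Nt}$ a finite sum, so $e^{At}$ has a closed form. A minimal sketch, assuming SciPy is available for comparison against the general matrix exponential (the matrix is illustrative):

```python
import numpy as np
from scipy.linalg import expm   # assumes SciPy is available

# Hypothetical matrix with repeated eigenvalue lam = 3:
# A = lam*I + N, where N is nilpotent (here N^2 = 0).
A = np.array([[3.0, 1.0],
              [0.0, 3.0]])
lam = 3.0
N = A - lam * np.eye(2)
assert np.allclose(N @ N, 0)    # nilpotent, so e^{Nt} = I + N*t exactly

t = 0.5
# lam*I commutes with N, so e^{At} = e^{lam*t} * (I + N*t).
E = np.exp(lam * t) * (np.eye(2) + N * t)
print(np.allclose(E, expm(A * t)))  # agrees with the general expm
```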
Best Answer
Since $p(x)$ has at least three roots $\lambda_1,\lambda_2,\lambda_3$, it may be written as $$p(x) = (x-\lambda_1)(x-\lambda_2)(x-\lambda_3)q(x)$$ where $q(x)$ is a polynomial with degree $3$ less than that of $p(x)$.
$A$ is diagonalizable, so its minimal polynomial splits into distinct linear factors, whose roots are the eigenvalues of $A$. So the minimal polynomial is $m_A(x) = (x-\lambda_1)(x-\lambda_2)(x-\lambda_3)$.
From the factorization above, $$p(x) = m_A(x)\,q(x),$$ so $m_A(x)$ divides $p(x)$.
We know that $m_A(A) = 0_n$, the zero matrix, so from the relation above, $p(A) = m_A(A)\,q(A) = 0_n$.
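The argument can be illustrated numerically. In this sketch the matrix, its eigenvalues, and the cofactor $q(x)$ are all hypothetical choices, picked only so the minimal polynomial is easy to read off:

```python
import numpy as np

# Hypothetical 4x4 diagonalizable matrix whose distinct eigenvalues are
# exactly 1, 2, 3 (a diagonal example keeps the minimal polynomial obvious).
A = np.diag([1.0, 2.0, 3.0, 2.0])
I = np.eye(4)

# m_A(A) = (A - I)(A - 2I)(A - 3I) is the zero matrix.
m_of_A = (A - 1.0 * I) @ (A - 2.0 * I) @ (A - 3.0 * I)
assert np.allclose(m_of_A, 0)

# Any p(x) = m_A(x) q(x) then annihilates A; take q(x) = x^2 + 5 as an
# illustrative cofactor.
p_of_A = m_of_A @ (A @ A + 5.0 * I)
print(np.allclose(p_of_A, 0))  # True
```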