Let $B_n(x)$ be the $n\times n$ matrix whose $(i,j)$-th entry is $x^{|i-j|}$. It is known that $B_n(x)$ is positive definite for $|x|<1$. Now the fact that $A_n$ is positive definite follows easily from $$A_n=\frac1{\Gamma(p)}\int_0^1(-\log x)^{p-1}B_n(x)\,dx.$$
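Both ingredients are easy to check numerically. Note that the displayed integral evaluates entrywise via $\int_0^1 x^k(-\log x)^{p-1}\,dx=\Gamma(p)/(k+1)^p$ (substitute $x=e^{-t}$), so $A_n$ has entries $(1+|i-j|)^{-p}$. A quick sketch with numpy ($n$ and $p$ are arbitrary test values):

```python
import numpy as np

def B(n, x):
    # B_n(x): (i,j) entry x^{|i-j|}
    idx = np.arange(n)
    return x ** np.abs(idx[:, None] - idx[None, :])

def A(n, p):
    # the integral representation evaluates entrywise to 1/(|i-j|+1)^p,
    # using  int_0^1 x^k (-log x)^{p-1} dx = Gamma(p)/(k+1)^p
    idx = np.arange(n)
    return (np.abs(idx[:, None] - idx[None, :]) + 1.0) ** (-p)

n, p = 6, 2.5
# B_n(x) is positive definite for |x| < 1, including negative x
assert all(np.linalg.eigvalsh(B(n, x)).min() > 0 for x in (-0.9, 0.1, 0.5, 0.9))
# hence so is A_n
assert np.linalg.eigvalsh(A(n, p)).min() > 0
```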
Every symmetric, positive-definite matrix has a square root.
In particular, the root can be required to be symmetric and positive-definite as well, and then it is uniquely determined.
For the given discretisation matrix $A^h$, which is highly structured, its square root
$$\sqrt{A^h} \;=\; \frac1{h^2}
\begin{pmatrix}
2 & -1 & 0 & &\\ -1 & 2 & -1 & & & \\
0 & -1 & \ddots & \ddots &\\ & & \ddots & \ddots & -1 & 0\\
& & & -1 & 2 & -1\\ & & & 0 & -1 & 2
\end{pmatrix}$$
is not far to seek: it can be found by an educated guess which extends
$$\begin{pmatrix}
5 & -4\\ -4 & 5
\end{pmatrix} \;=\;
\begin{pmatrix}
2 & -1\\ -1 & 2
\end{pmatrix}^2\:.$$
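That the guess extends can be checked directly: squaring the $n\times n$ tridiagonal matrix (here with $h=1$, as a numpy sketch) reproduces the $2\times2$ pattern in the corner rows and yields pentadiagonal rows $1,-4,6,-4,1$ in the interior; the resulting matrix is presumably the $h^4 A^h$ in question.

```python
import numpy as np

n = 6
# the tridiagonal matrix displayed above (h = 1)
T = 2*np.eye(n, dtype=int) - np.eye(n, k=1, dtype=int) - np.eye(n, k=-1, dtype=int)
T2 = T @ T   # candidate for h^4 * A^h

# corner row reproduces the 2x2 example: 5 -4 1 0 ...
assert (T2[0, :3] == np.array([5, -4, 1])).all() and (T2[0, 3:] == 0).all()
# interior rows carry the pentadiagonal pattern 1 -4 6 -4 1
assert (T2[2, :5] == np.array([1, -4, 6, -4, 1])).all()
```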
And $\sqrt{A^h}$ is positive-definite:
Cf. https://en.wikipedia.org/wiki/Definite_matrix#Examples for the $3\times 3$ case; the method should generalise to higher dimensions.
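Positive definiteness can also be read off from the known spectrum: the eigenvalues of the $n\times n$ matrix $\operatorname{tridiag}(-1,2,-1)$ are $2-2\cos\frac{k\pi}{n+1}>0$ for $k=1,\dots,n$. A quick numerical check (sketch, $h=1$):

```python
import numpy as np

n = 7
T = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order
eigs = np.linalg.eigvalsh(T)
closed_form = 2 - 2*np.cos(np.arange(1, n + 1)*np.pi/(n + 1))

assert np.allclose(np.sort(eigs), np.sort(closed_form))
assert eigs.min() > 0   # positive definite
```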
And see here on this site:
The full statement, including the proof of Theorem $(7.4.7)$, starts on page 537 of https://zhilin.math.ncsu.edu/TEACHING/MA580/Stoer_Bulirsch.pdf.
An explicit sum-of-squares expression, as asked for in the OP, is given by
$$(2u_1-u_2)^2 \,+\, \sum_{j=2}^{n-1}\,(-u_{j-1}+2u_j-u_{j+1})^2 \,+\, (2u_n-u_{n-1})^2\tag{SoS}$$
Given the square root $\sqrt{A^h}$, it results from expanding
\begin{align}
h^4\cdot\mathbf{u}^TA^h\,\mathbf{u} & \:=\:
\left(h^2\cdot\mathbf{u}^T\sqrt{A^h}\right) \left(h^2\cdot\sqrt{A^h}\,\mathbf{u}\right) \\[2ex]
& \:=\:
\big(2u_1-u_2,\; -u_1+2u_2-u_3,\; \dots,\; -u_{n-1}+2u_n\big)
\begin{pmatrix}
2u_1 -u_2\\ -u_1 +2u_2 -u_3\\
\vdots \\
-u_{n-2} +2u_{n-1} -u_n \\ -u_{n-1} +2u_n
\end{pmatrix}
\end{align}
The expression $(\text{SoS})$ vanishes only if $\mathbf{u}= \mathbf{0}$:
its vanishing is equivalent to the linear system $\sqrt{A^h}\,\mathbf{u} = \mathbf{0}$, which has only the trivial solution.
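The identity can also be verified numerically. A sketch with numpy, writing $T=h^2\sqrt{A^h}$ for the tridiagonal matrix displayed above, so that $h^4\,\mathbf u^TA^h\mathbf u=\|T\mathbf u\|^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
T = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # T = h^2 * sqrt(A^h)
u = rng.standard_normal(n)

# the sum-of-squares expression (SoS), term by term
sos = ((2*u[0] - u[1])**2
       + sum((-u[j-1] + 2*u[j] - u[j+1])**2 for j in range(1, n - 1))
       + (2*u[-1] - u[-2])**2)

assert np.isclose(sos, u @ T @ T @ u)   # = h^4 * u^T A^h u
assert np.linalg.det(T) > 0             # kernel trivial: SoS = 0 only for u = 0
```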
Best Answer
HINT:
The matrix is symmetric and diagonally dominant, hence positive semidefinite. To show that it is actually positive definite, you need to check that the kernel is trivial. Take a vector in the kernel: the first equation shows that its first two components are equal; the middle equations then force the components to be in arithmetic progression, hence constant; and the relation between the last two components coming from the last equation forces that constant to be $0$. Thus the vector is $\mathbf 0$.
$\bf{Added:}$ Replace the off-diagonal entries $-1$ by $-\epsilon$, and call the resulting matrix $M_{\epsilon}$. For every $\epsilon \in [0,1)$ the matrix is strictly diagonally dominant, so $0$ is not an eigenvalue. The eigenvalues of $M_{\epsilon}$ vary continuously with $\epsilon$, and the eigenvalues of $M_{0}$ are $>0$, so no eigenvalue of $M= M_1$ can be negative.
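The continuity argument can be illustrated numerically. The concrete matrix below is an assumption on my part (a tridiagonal matrix with diagonal $(1,2,\dots,2)$ and off-diagonal entries $-\epsilon$, chosen to match the kernel argument in the hint), so treat this as a sketch rather than the OP's exact matrix:

```python
import numpy as np

def M(eps, n=6):
    # assumed matrix: diagonal (1, 2, ..., 2), off-diagonal entries -eps
    A = 2.0*np.eye(n) - eps*np.eye(n, k=1) - eps*np.eye(n, k=-1)
    A[0, 0] = 1.0
    return A

# the smallest eigenvalue stays positive along the whole path eps: 0 -> 1
for eps in np.linspace(0.0, 1.0, 11):
    assert np.linalg.eigvalsh(M(eps)).min() > 0
```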