I doubt that this is the best approach since it basically only uses the definition of positive semidefinite (PSD) matrices and requires a fact about the pointwise product of PSD matrices, but maybe it will give you some ideas.
We have that,
$$
\sin^2(x_i-x_j) = \frac{1}{2} - \frac{1}{2}\cos(2(x_i-x_j))
= \frac{1}{2} - \frac{1}{2}\big[ \cos(2x_i)\cos(2x_j) + \sin(2x_i)\sin(2x_j) \big]
$$
Therefore we can write,
$$
A_{i,j} = e^{-\lambda/2}\exp\left(\frac{\lambda}{2}\cos(2x_i)\cos(2x_j) +\frac{\lambda}{2}\sin(2x_i)\sin(2x_j) \right)
$$
Now, note that the matrix,
$$
B_{i,j} = \frac{\lambda}{2}\cos(2x_i)\cos(2x_j) + \frac{\lambda}{2}\sin(2x_i)\sin(2x_j)
$$
is the sum of two rank-1 outer products and therefore PSD.
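As a quick numerical sanity check (a NumPy sketch; the sample points and $\lambda$ below are arbitrary choices, not from the question), building $B$ from the two outer products shows no negative eigenvalues beyond roundoff:

```python
import numpy as np

lam = 3.0
x = np.linspace(0.1, 2.0, 7)

# B = u u^T + v v^T with u = sqrt(lam/2) cos(2x), v = sqrt(lam/2) sin(2x),
# so B is a sum of two rank-1 outer products and hence PSD.
u = np.sqrt(lam / 2) * np.cos(2 * x)
v = np.sqrt(lam / 2) * np.sin(2 * x)
B = np.outer(u, u) + np.outer(v, v)

print(np.linalg.eigvalsh(B).min())  # nonnegative up to roundoff
```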
Note: Maybe for your needs it's sufficient to show that the part of the kernel in the exponent is a sum of separable functions, in which case you are done.
Using Taylor expansion we have,
$$
A_{i,j}
= e^{-\lambda/2} \sum_{k=0}^{\infty} \frac{1}{k!} \left( \frac{\lambda}{2}\cos(2x_i)\cos(2x_j) + \frac{\lambda}{2}\sin(2x_i)\sin(2x_j) \right)^k
= e^{-\lambda/2} \sum_{k=0}^{\infty} \frac{1}{k!} B_{i,j}^k
$$
So in matrix form, using $\circ$ to denote the pointwise (Hadamard) product of matrices,
$$
A = e^{-\lambda/2} \left( J + B + \frac{1}{2!} B\circ B + \frac{1}{3!} B\circ B\circ B + \cdots \right)
$$
where $J = ee^{\top}$ is the all-ones matrix (the $k=0$ term of the entrywise series is $1$ in every entry), which is itself PSD as a rank-1 outer product.
Finally, we claim that this matrix is PSD.
First, note that the pointwise product of two PSD matrices is again PSD (the Schur product theorem). Therefore each of the terms $B\circ B\circ \cdots \circ B$ is PSD, and the sum of PSD matrices is also PSD.
So the series consists of pointwise products and sums of PSD matrices; since it converges (entrywise, which you know it does), the limit is PSD as well.
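The whole chain of identities can also be checked numerically (a NumPy sketch; the random points and $\lambda$ are arbitrary choices): the kernel matrix built directly agrees entrywise with $e^{-\lambda/2}$ times the entrywise exponential of $B$, the partial sums of the Hadamard series reproduce it, and its smallest eigenvalue is nonnegative up to roundoff.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 3.0
x = rng.uniform(0.0, np.pi, size=8)

# Kernel matrix directly: A_ij = exp(-lambda * sin^2(x_i - x_j))
A = np.exp(-lam * np.sin(x[:, None] - x[None, :]) ** 2)

# B_ij = (lam/2)(cos 2x_i cos 2x_j + sin 2x_i sin 2x_j)
c2, s2 = np.cos(2 * x), np.sin(2 * x)
B = (lam / 2) * (np.outer(c2, c2) + np.outer(s2, s2))

# Entrywise identity A = e^{-lam/2} * exp(B) from the derivation above
assert np.allclose(A, np.exp(-lam / 2) * np.exp(B))

# Partial sums of the Hadamard series J + B + B∘B/2! + ... (J = all-ones)
S, term = np.ones_like(B), np.ones_like(B)
for k in range(1, 30):
    term = term * B / k  # term = k-fold pointwise product of B, over k!
    S = S + term
assert np.allclose(np.exp(-lam / 2) * S, A)

# A is numerically PSD
print(np.linalg.eigvalsh(A).min())
```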
A quick way of seeing this might be to note the Frullani-type integral $$\int_0^\infty \frac{e^{-t} - e^{-at}}{t}\,dt = \log a \quad \text{for all } a > 0.$$
So the quadratic form above can be written as
\begin{align*}
\sum_{i,j=1}^{n} c_ic_j \log(x_i + x_j)
&= \sum_{i,j=1}^{n} c_ic_j \int_0^\infty \frac{e^{-t} - e^{-(x_i+x_j)t}}{t}\,dt \\
&= \int_0^\infty \frac{1}{t}\left(e^{-t}\left(\sum_{j=1}^n c_j\right)^2 - \left(\sum_{j=1}^n c_je^{-x_jt}\right)^2\right)\,dt \\
&= -\int_0^{\infty} \frac{1}{t} \left(\sum_{j=1}^n c_je^{-x_jt}\right)^2 \,dt \le 0
\end{align*}
where we used the fact that $\displaystyle \sum_{j=1}^n c_j = 0$.
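The sign of the quadratic form is easy to check numerically (a NumPy sketch; the positive points $x_i$ and the centered weights $c$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
x = rng.uniform(0.5, 5.0, size=n)  # x_i > 0
c = rng.normal(size=n)
c -= c.mean()                      # enforce sum_j c_j = 0

# L_ij = log(x_i + x_j); the quadratic form c^T L c should be <= 0
L = np.log(x[:, None] + x[None, :])
q = c @ L @ c
print(q)  # nonpositive
```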
Best Answer
Hint (without integrals): Let $X=\operatorname{diag}(x_1,x_2,\ldots,x_n)$ and $e$ be the vector of all ones.