I can show that there is no universal formula. More precisely, let $k$ be an algebraically closed field and let $R = k[w,x,y,z,\Delta]/(\Delta^2-wz+xy)$. I claim that there do not exist $2 \times 2$ matrices $A_1$, $A_2$, ..., $A_{2t}$ with entries in $R$, such that every matrix occurs an even number of times in the list up to transpose, and $\left( \begin{smallmatrix} w & x \\ y & z \end{smallmatrix} \right) = A_1 A_2 \cdots A_{2t}$. In fact, we are going to show something much stronger: It is impossible to write $\left( \begin{smallmatrix} w & x \\ y & z \end{smallmatrix} \right)$ as $MN$ for two noninvertible $2 \times 2$ matrices with entries in $R$.
Since $\det M \cdot \det N = \det \left( \begin{smallmatrix} w & x \\ y & z \end{smallmatrix} \right) = wz - xy = \Delta^2$ is homogeneous of degree $2$ and $R$ is a graded integral domain, $\det M$ and $\det N$ would have to be of pure degree. If $\det M$ has degree $0$ then it is a nonzero scalar, so $M$ is invertible. So we must have $\det M$ and $\det N$ of degree $1$.
Write $M=M_0 + M_1 + \cdots$ where $M_k$ is pure of degree $k$, and likewise for $N$. Comparing homogeneous components of degrees $0$ and $1$ in $MN$, we get $M_0 N_0 = 0$ and $$\left( \begin{smallmatrix} w & x \\ y & z \end{smallmatrix} \right) = M_0 N_1 + M_1 N_0.$$
Since $\det M$ has no constant term, we have $\det M_0 =0$ and, similarly, $\det N_0 =0$. Let $\left( \begin{smallmatrix} u_1 & u_2 \end{smallmatrix} \right) \in k^2$ be a nonzero vector in the left kernel of $M_0$ and let $\left( \begin{smallmatrix} v_1 \\ v_2 \end{smallmatrix} \right) \in k^2$ be a nonzero vector in the right kernel of $N_0$; these exist because the determinants vanish. So the above equation shows that $\left( \begin{smallmatrix} u_1 & u_2 \end{smallmatrix} \right) \left( \begin{smallmatrix} w & x \\ y & z \end{smallmatrix} \right) \left( \begin{smallmatrix} v_1 \\ v_2 \end{smallmatrix} \right) =0$.
But there are no nonzero vectors in $k^2$ for which $\left( \begin{smallmatrix} u_1 & u_2 \end{smallmatrix} \right) \left( \begin{smallmatrix} w & x \\ y & z \end{smallmatrix} \right) \left( \begin{smallmatrix} v_1 \\ v_2 \end{smallmatrix} \right) =0$, a contradiction.
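Spelling out that last step: expanding the product gives
$$\left( \begin{smallmatrix} u_1 & u_2 \end{smallmatrix} \right) \left( \begin{smallmatrix} w & x \\ y & z \end{smallmatrix} \right) \left( \begin{smallmatrix} v_1 \\ v_2 \end{smallmatrix} \right) = u_1 v_1\, w + u_1 v_2\, x + u_2 v_1\, y + u_2 v_2\, z,$$
and since $w$, $x$, $y$, $z$, $\Delta$ are linearly independent in the degree-$1$ part of $R$ (the defining relation lives in degree $2$), this vanishes only if all four products $u_i v_j$ are zero, which forces $u = 0$ or $v = 0$.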
UPDATE: Pushing this argument further, I can show that $\begin{pmatrix} 0 & u & v & w \\ -u & 0 & x & y \\ -v & -x & 0 & z \\ -w & -y & -z & 0 \end{pmatrix}$ cannot be expressed as $MN$ with $\det M$ and $\det N$ nonconstant. Proof: Abbreviate the above matrix to $S$. As before, deduce that we can write $M_0 N_0 =0$ and $M_0 N_1 + M_1 N_0 = S$ with $\det M_0 = \det N_0 = 0$. So $\mathrm{rank} M_0 + \mathrm{rank} N_0 \leq 4$. Without loss of generality, suppose that $M_0$ has nullity at least $2$ and $N_0$ has nullity at least $1$. So, as before, we can find linearly independent constant vectors $p$ and $q$ with $p^T M_0=q^T M_0 =0$ and a nonzero vector $r$ with $N_0 r=0$. We have $p^T S r = q^T S r = 0$ as before.
Let $p=(p_1, p_2, p_3, p_4)$ and $r=(r_1, r_2, r_3, r_4)$. Looking at the coefficient of $u$ in $p^T S r$, we get $p_1 r_2 = p_2 r_1$. Looking at $v$, $w$, $x$, $y$ and $z$, we see that $p_i r_j = p_j r_i$ for all $(i,j)$, so $p$ and $r$ are proportional. Similarly, $q$ and $r$ are proportional, so $p$ and $q$ are proportional, contradicting that we chose them linearly independent. QED.
I have no $2 \times 2$ counter-example.
Yes, the characteristic polynomial is given by $(-1)^m U_{2m}\!\left(\tfrac{1}{2\lambda}\right) \lambda^{2m}$, where $U_n$ is the Chebyshev polynomial of the second kind.
The inverse matrix is given by $$\begin{pmatrix} 2 & -1 & 0 & \cdots & 0 \\ -1 & 2 & -1 & \cdots & 0 \\ 0 & -1 & 2 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & -1 \\ 0 & 0 & 0 & -1 & 1\end{pmatrix}$$
where the bottom-right entry (and no other diagonal entry) is a $1$. (This can be seen by writing your matrix as $A^T A$, where $A$ has $1$s on the diagonal and upper triangle and $0$s in the lower triangle, and taking the inverse of $A$.)
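This is easy to confirm numerically. A minimal Python sketch (the names `A`, `B`, and `M` are mine), assuming the matrix in question is $A^T A$ for the $A$ just described:

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n = 6
# A: 1s on the diagonal and in the upper triangle, 0s below.
A = [[int(j >= i) for j in range(n)] for i in range(n)]
At = [[A[j][i] for j in range(n)] for i in range(n)]
B = matmul(At, A)   # the matrix being inverted; B[i][j] = min(i, j) + 1

# Claimed inverse: tridiagonal, 2s on the diagonal except a 1 in the
# bottom-right corner, -1s adjacent to the diagonal.
M = [[0] * n for _ in range(n)]
for i in range(n):
    M[i][i] = 2
    if i + 1 < n:
        M[i][i + 1] = M[i + 1][i] = -1
M[n - 1][n - 1] = 1

I = [[int(i == j) for j in range(n)] for i in range(n)]
assert matmul(M, B) == I and matmul(B, M) == I
print("inverse verified for n =", n)
```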
The characteristic polynomial is $Q_m(\lambda) + Q_{m-1}(\lambda)$, where $Q_m$ is the characteristic polynomial of the $m \times m$ matrix with $2$s on the diagonal, $-1$s adjacent to the diagonal, and $0$s elsewhere.
Laplace expansion gives $$Q_m(\lambda)= (\lambda-2) Q_{m-1}(\lambda) - Q_{m-2} (\lambda)$$ which gives the generating function $$\sum_{m=0}^{\infty} Q_m(\lambda) t^m = \frac{1}{ 1 - (\lambda -2) t + t^2} $$ so the characteristic polynomial of this matrix has the generating function $$\frac{1+t }{ 1 - (\lambda -2) t + t^2}.$$
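As a sanity check, the recurrence can be compared against characteristic polynomials computed directly, both for the all-$2$s tridiagonal matrix and for the variant with a $1$ in the corner. A Python sketch (the helper names are mine; exact rational arithmetic via `Fraction`):

```python
from fractions import Fraction

def charpoly(A):
    # Coefficients of det(lambda*I - A), highest power first,
    # via the Faddeev-LeVerrier algorithm.
    n = len(A)
    M = [[Fraction(i == j) for j in range(n)] for i in range(n)]
    coeffs = [Fraction(1)]
    for k in range(1, n + 1):
        AM = [[sum(A[i][l] * M[l][j] for l in range(n)) for j in range(n)]
              for i in range(n)]
        c = -sum(AM[i][i] for i in range(n)) / k
        coeffs.append(c)
        M = [[AM[i][j] + (c if i == j else 0) for j in range(n)]
             for i in range(n)]
    return coeffs

def tridiag(m, corner=2):
    # 2s on the diagonal (bottom-right entry = corner), -1s next to it.
    T = [[Fraction(0)] * m for _ in range(m)]
    for i in range(m):
        T[i][i] = Fraction(2)
        if i + 1 < m:
            T[i][i + 1] = T[i + 1][i] = Fraction(-1)
    T[m - 1][m - 1] = Fraction(corner)
    return T

def poly_add(p, q):
    # Add coefficient lists (highest power first), padding on the left.
    n = max(len(p), len(q))
    p = [Fraction(0)] * (n - len(p)) + list(p)
    q = [Fraction(0)] * (n - len(q)) + list(q)
    return [a + b for a, b in zip(p, q)]

# Q_m from the recurrence Q_m = (lambda - 2) Q_{m-1} - Q_{m-2}.
Q = [[Fraction(1)], [Fraction(1), Fraction(-2)]]   # Q_0 = 1, Q_1 = lambda - 2
for m in range(2, 8):
    lam_Q = Q[m - 1] + [Fraction(0)]               # lambda * Q_{m-1}
    term = poly_add(lam_Q, [-2 * a for a in Q[m - 1]])
    Q.append(poly_add(term, [-a for a in Q[m - 2]]))

for m in range(1, 8):
    assert charpoly(tridiag(m)) == Q[m]
    assert charpoly(tridiag(m, corner=1)) == poly_add(Q[m], Q[m - 1])
print("verified Q_m and Q_m + Q_{m-1} for m = 1..7")
```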
Since the determinant is $1$, we can obtain the characteristic polynomial of the inverse matrix by substituting $\lambda^{-1}$ for $\lambda$ and multiplying by $(-\lambda)^m$, i.e. substituting $(-\lambda t)$ for $t$, getting
$$\frac{1 - \lambda t }{ 1+ (1 -2\lambda ) t + \lambda^2 t^2}$$
as the generating function for the characteristic polynomial of your matrix.
The Chebyshev polynomials of the second kind have the generating function $$ \sum_{n=0}^{\infty} U_n(x) t^n = \frac{1}{ 1- 2x t+ t^2}$$ so
$$ \sum_{m=0}^{\infty} U_{2m}(x) t^{2m}=\frac{1}{2} \left( \frac{1}{ 1- 2x t+ t^2} + \frac{1}{ 1+ 2xt + t^2}\right) = \frac{ 1 + t^2 } { 1 + 2t^2 + t^4 - 4 x^2 t^2 } $$
and thus
$$ \sum_{m=0}^{\infty} (-1)^m U_{2m}(1/2\lambda ) \lambda^{2m} t^{2m} = \frac{ 1 - \lambda^2 t^2 } { 1 - 2\lambda^2 t^2 + \lambda^4 t^4 + t^2 } $$
which is the same after substituting $\lambda^2$ for $\lambda$ and $t^2$ for $t$.
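The identity can also be checked directly for small $m$. A Python sketch (the names are mine), taking the matrix in question to be $A^T A$ as above, so its $(i,j)$ entry is $\min(i,j)$ (1-indexed): compute its characteristic polynomial, substitute $\lambda^2$, and compare with $(-1)^m U_{2m}(1/(2\lambda))\lambda^{2m}$.

```python
from fractions import Fraction

def charpoly(A):
    # Coefficients of det(lambda*I - A), highest power first,
    # via the Faddeev-LeVerrier algorithm.
    n = len(A)
    M = [[Fraction(i == j) for j in range(n)] for i in range(n)]
    coeffs = [Fraction(1)]
    for k in range(1, n + 1):
        AM = [[sum(A[i][l] * M[l][j] for l in range(n)) for j in range(n)]
              for i in range(n)]
        c = -sum(AM[i][i] for i in range(n)) / k
        coeffs.append(c)
        M = [[AM[i][j] + (c if i == j else 0) for j in range(n)]
             for i in range(n)]
    return coeffs

m_max = 5
# Chebyshev polynomials of the second kind, x^k coefficients in ascending
# order: U_0 = 1, U_1 = 2x, U_{n+1} = 2x U_n - U_{n-1}.
U = [[Fraction(1)], [Fraction(0), Fraction(2)]]
for n in range(2, 2 * m_max + 1):
    two_x_U = [Fraction(0)] + [2 * a for a in U[n - 1]]
    prev = U[n - 2] + [Fraction(0)] * (len(two_x_U) - len(U[n - 2]))
    U.append([a - b for a, b in zip(two_x_U, prev)])

for m in range(1, m_max + 1):
    # B = A^T A has (i, j) entry min(i, j), 1-indexed.
    B = [[Fraction(min(i, j) + 1) for j in range(m)] for i in range(m)]
    P = charpoly(B)
    # P_m(lambda^2): spread the coefficients over even powers of lambda.
    lhs = []
    for c in P:
        lhs += [c, Fraction(0)]
    lhs = lhs[:-1]
    # (-1)^m lambda^{2m} U_{2m}(1/(2 lambda)): the coefficient of
    # lambda^{2m-k} is (-1)^m u_k / 2^k, where u_k is the x^k coefficient.
    u = U[2 * m]
    rhs = [Fraction((-1) ** m) * u[k] / 2 ** k for k in range(2 * m + 1)]
    assert lhs == rhs
print("verified the Chebyshev formula for m = 1..", m_max)
```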
The statement is false if $P^*$ is taken to mean the element-by-element complex conjugate of $P(\lambda)$. A counterexample: let $m=1$, $\omega_0 = \omega_1 = 1$, and $$A_0 = \left( \begin{array}{cc} 0&i\\2i&0 \end{array} \right), \qquad A_1 = \left( \begin{array}{cc} 1&i\\0&2 \end{array} \right). $$ Then the coefficients of $p(x,y)$ come out to be non-real.
You must have meant $P^\dagger(\lambda)P(\lambda)$, the Hermitian conjugate of the matrix. With that change:
$t$ is a real polynomial in the variable $\lambda = \sqrt{x^2+y^2}$.
Proposition 1: $\forall n \in \Bbb{N} : \lambda^n $ is either a polynomial in $x^2$ and $y^2$ or (if $n$ is odd) $\sqrt{x^2+y^2}$ times a polynomial in $x^2$ and $y^2$.
Since all the $\omega_m$ are real, we have by Proposition 1 that $t = Q(|\lambda|) = Q(\sqrt{x^2+y^2})$ is of the form $$ t = P_1(x,y) + \sqrt{x^2+y^2}\,P_2(x,y) $$ where $P_1(x,y)$ and $P_2(x,y)$ are both real polynomials.
Now for a given set of $A_m$, each element of $P(\lambda)$ is a (possibly complex) polynomial in $(x,y)$. But each element of $P^\dagger(\lambda)P(\lambda)$ is real-valued, and thus is a real polynomial in $(x,y)$.
Then each element of $tI - P^\dagger(\lambda)P(\lambda)$ is a real polynomial in $(x,y)$ plus, for diagonal elements, an expression of the form $P_1(x,y) + \sqrt{x^2+y^2}P_2(x,y)$.
So each element of $tI - P^\dagger(\lambda)P(\lambda)$ is of the form $P_1(x,y) + \sqrt{x^2+y^2}P_2(x,y)$.
Finally, the determinant of a matrix is a polynomial function of all of its elements. This brings us home, because any polynomial function of elements of the form $P_1(x,y) + \sqrt{x^2+y^2}P_2(x,y)$ is itself of the form $P_3(x,y) + \sqrt{x^2+y^2}P_4(x,y)$. Identify in your problem $q(x,y)$ with $P_3$ and $p(x,y)$ with $P_4$.
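The closure claim can be checked directly for a product of two such elements. Writing $s = \sqrt{x^2+y^2}$, so that $s^2 = x^2 + y^2$ is itself a polynomial,
$$\bigl(P_1 + s\,P_2\bigr)\bigl(P_3 + s\,P_4\bigr) = \bigl(P_1 P_3 + (x^2+y^2)\,P_2 P_4\bigr) + s\,\bigl(P_1 P_4 + P_2 P_3\bigr),$$
which is again of the same form; sums are closed trivially, and every polynomial function is built from sums and products.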
By the way, if only even powers appear in $Q$, then $P_2(x,y) = 0$, since every term is itself a polynomial in $x^2+y^2$. Since the off-diagonal elements are also pure polynomials in $x$ and $y$, in that situation $p(x,y) = 0$.