I have no answer to your question, but some related examples.
Consider the q-Fibonacci polynomials defined by $f(0, x, s)=0$, $f(1, x, s)=1$, and $f(n, x, s)=x\, f(n-1, x, s)+q^{n-2} s\, f(n-2, x, s)$.
Then the subsequences $f(kn, x, s)$ satisfy a homogeneous recursion with rational coefficients which, for $k>2$, are not polynomials (see, e.g., my paper arXiv:0806.0805).
More precisely,
$$f(k, x, q^k s)\, f(kn, x, s) - f(2k, x, s)\, f(k(n-1), x, q^k s) + (-1)^k q^{k(3k-1)/2} s^k f(k, x, s)\, f(k(n-2), x, q^{2k} s) = 0,$$
or equivalently
$$f(k, x, q^{n-2k} s)\, f(kn, x, s) - f(2k, x, q^{k(n-2)} s)\, f(k(n-1), x, s) + (-1)^k q^{-k(3k+1)/2} q^{k^2 n} s^k f(k, x, q^{k(n-1)} s)\, f(k(n-2), x, s) = 0.$$
Analogous results are true for powers of q-Fibonacci polynomials.
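For what it's worth, here is a quick numerical sanity check of the first identity above; this is only a sketch, the function name `f` and the test values of `x`, `s`, `q` are my own arbitrary choices.

```python
def f(n, x, s, q):
    """q-Fibonacci polynomial value: f(0)=0, f(1)=1,
    f(n) = x*f(n-1) + q^(n-2)*s*f(n-2)."""
    a, b = 0.0, 1.0  # f(0), f(1)
    for m in range(2, n + 1):
        a, b = b, x * b + q ** (m - 2) * s * a
    return a if n == 0 else b

# Arbitrary (non-special) test values.
x, s, q = 1.3, 0.7, 0.9

for k in (1, 2):
    for n in range(2, 8):
        lhs = (f(k, x, q**k * s, q) * f(k * n, x, s, q)
               - f(2 * k, x, s, q) * f(k * (n - 1), x, q**k * s, q)
               + (-1)**k * q**(k * (3 * k - 1) // 2) * s**k
                 * f(k, x, s, q) * f(k * (n - 2), x, q**(2 * k) * s, q))
        assert abs(lhs) < 1e-8, (k, n, lhs)
print("first identity checked for k = 1, 2 and n = 2, ..., 7")
```

For $k=1$ the second identity reduces exactly to the defining recursion, which is a good hand-check of the constants.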
I may as well write down how I've been attacking this, although I don't have a solution yet. Sequences $(p_i)$ of nonnegative integers with sum $r$ are in bijection with weakly increasing $r$-tuples $(n_1, n_2, \ldots, n_r)$ of positive integers. Specifically, given the sequence $(p_0, p_1, p_2, \ldots)$, form the sequence of partial sums $(q_1, q_2, \ldots)$ given by $q_i:=p_0+p_1+\ldots+p_{i-1}$. Let $n_j$ be the minimal $i$ for which $q_i \geq j$. For example, $(1,0,0,1,0,0,2,0,0,\ldots)$ corresponds to $(1, 4, 7,7)$. So, we want to compute the sum over all weakly increasing $r$-tuples and prove it is equal to $1/r!$.
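This bijection is easy to sketch in code (the function name is mine):

```python
def tuple_from_sequence(p, r):
    """Map a sequence (p_0, p_1, ...) of nonnegative integers summing to r
    to the weakly increasing r-tuple (n_1, ..., n_r): n_j is the minimal i
    with q_i = p_0 + ... + p_{i-1} >= j, so the value i occurs p_{i-1} times."""
    result = []
    for i, p_prev in enumerate(p, start=1):  # p_prev = p_{i-1}
        result.extend([i] * p_prev)
        if len(result) >= r:
            break
    return tuple(result[:r])

print(tuple_from_sequence((1, 0, 0, 1, 0, 0, 2), 4))  # -> (1, 4, 7, 7)
```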
For each weakly increasing $r$-tuple, let us sum instead over all $r!$ permutations of the $r$-tuple. So we can view our sum as being over all $r$-tuples of nonnegative integers, and we want to prove now that the sum is $1$. One difficulty is that some $r$-tuples will appear more than once. For example, $(1,4,7,7)$ will appear twice, because the permutation which switches the $7$'s stabilizes this quadruple. It turns out that the multiplicity of an $r$-tuple is precisely $\prod (p_i)!$. So, what we want to show is that
$$\sum_{n_1, n_2, \ldots, n_r \geq 1} \frac{1}{\prod (p_i)! \prod \binom{p_{k-1}+p_k+k}{k}} =1$$
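The multiplicity claim can be checked by brute force on small examples (the helper names are mine):

```python
from itertools import permutations
from math import factorial

def multiplicity(t):
    """Number of the len(t)! permutations of t that reproduce t itself,
    i.e. the size of its stabilizer."""
    return sum(1 for perm in permutations(t) if perm == t)

def predicted(t):
    """prod (p_i)!, where p_i is the number of entries of t equal to i."""
    result = 1
    for v in set(t):
        result *= factorial(t.count(v))
    return result

# (1,4,7,7): only the value 7 repeats, so prod (p_i)! = 2! = 2.
for t in [(1, 4, 7, 7), (1, 2, 3, 4), (5, 5, 5)]:
    assert multiplicity(t) == predicted(t)
```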
Now, when none of the $n_i$ are equal to each other, and when none of them differ by $1$, the summand is
$$\frac{1}{n_1(n_1+1)n_2(n_2+1) \cdots n_r(n_r+1)} = \left( \frac{1}{n_1} - \frac{1}{n_1+1} \right) \cdot \left( \frac{1}{n_2} - \frac{1}{n_2+1} \right) \cdots \left( \frac{1}{n_r} - \frac{1}{n_r+1} \right).$$
This is set up beautifully to telescope. If I could just find a similar nice description for when the $n_i$ collide or are adjacent ...
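For what it's worth, the one-variable telescoping underlying this, and the factorization of the summand, can be checked exactly with rational arithmetic:

```python
from fractions import Fraction

# One variable telescopes: sum_{n=1}^{N} 1/(n(n+1)) = 1 - 1/(N+1) -> 1,
# so summing the r-fold product over independent n_1, ..., n_r >= 1 gives 1.
N = 100
partial = sum(Fraction(1, n * (n + 1)) for n in range(1, N + 1))
assert partial == 1 - Fraction(1, N + 1)

# The factorization of the summand, checked at a sample point:
n1, n2 = 3, 8
lhs = Fraction(1, n1 * (n1 + 1) * n2 * (n2 + 1))
rhs = ((Fraction(1, n1) - Fraction(1, n1 + 1))
       * (Fraction(1, n2) - Fraction(1, n2 + 1)))
assert lhs == rhs
```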
Best Answer
This is a response to QUESTION 1.
As Fedor pointed out, we're dealing with the Chebyshev polynomials $P_n(2\cos t)=\sin nt/\sin t$. So we must show that if $$ \sum_{n=1}^N \sin nt = 0 , \quad\quad\quad\quad (1) $$ then also $\sum_{n=1}^N \sin^m nt = 0$ for any odd exponent $m\ge 1$.
We may take $0<t<\pi/2$. Also, the sum in (1) can of course be evaluated, and we find that (1) is equivalent to $$ \cos t/2 = \cos (N+1/2) t . \quad\quad\quad\quad (2) $$ I now claim that if (2) holds for $t$, then it also holds for any multiple of $t$. To see this, we just notice that (2) means that $s=t/2$ satisfies $(2N+1)s = 2\pi M\pm s$, for some $M\ge 1$ and a choice of sign (recall that $0<s<\pi/4$). In other words, (2) requires $s$ to be a rational multiple of $\pi$ with denominator $N$ or $N+1$, and clearly this property is preserved under taking integer multiples.
Now everything is clear: $\sin^m\alpha$ can be written as a linear combination of $\sin j\alpha$, with $j$ odd, and we have just seen that (1) implies that also $\sum_{n=1}^N \sin njt = 0$ for any odd $j$.
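A quick numerical check of the conclusion; the sample root is my choice: $N=5$, $t=2\pi/5$, so $s=t/2=\pi/5$ is a rational multiple of $\pi$ with denominator $N$.

```python
from math import sin, pi

N = 5
t = 2 * pi / 5  # s = t/2 = pi/5, denominator N

# t is a root of (1): sum_{n=1}^{N} sin(nt) = 0.
assert abs(sum(sin(n * t) for n in range(1, N + 1))) < 1e-12

# The claimed consequence: sums of odd powers vanish as well.
for m in (1, 3, 5, 7, 9):
    total = sum(sin(n * t) ** m for n in range(1, N + 1))
    assert abs(total) < 1e-12, (m, total)
print("sum of sin(nt)^m vanishes for m = 1, 3, 5, 7, 9")
```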