I have seen a cat of a similar breed in the representation theory of symmetric groups. Out of habit, let me quote a lemma attributed to Littlewood in
Donald Knutson, $\lambda$-rings and the Representation Theory of the Symmetric Group, Springer 1973 (LNM #308), Chapter III, section 2, p. 149:
$\sum\limits_{\sigma\in S_n} f\left(x_{\sigma\left(1\right)},x_{\sigma\left(2\right)},...,x_{\sigma\left(n\right)}\right) = \frac{1}{x_1x_2...x_n}$.
At the moment, neither does this cat imply yours, nor the other way round. But can we cross them?
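For the record, Littlewood's cat is easy to spot-check numerically. Here is a quick Python sketch; note that the definition of $f$ used below, namely $f\left(x_1,\ldots,x_m\right)=\prod_{i=1}^m \frac{1}{x_1+x_2+\cdots+x_i}$, is my assumption (it is not restated in this excerpt), chosen as the one that makes the right-hand side come out as $\frac{1}{x_1x_2\cdots x_n}$:

```python
from itertools import permutations
from fractions import Fraction

def f(args):
    # assumed definition: f(x_1, ..., x_m) = prod_{i=1}^m 1/(x_1 + ... + x_i)
    total, prod = Fraction(0), Fraction(1)
    for x in args:
        total += x
        prod /= total
    return prod

def littlewood_lhs(xs):
    # sum over all permutations sigma of f(x_{sigma(1)}, ..., x_{sigma(n)})
    return sum(f([xs[i] for i in perm])
               for perm in permutations(range(len(xs))))

xs = [Fraction(v) for v in (2, 3, 5, 7)]
lhs = littlewood_lhs(xs)
rhs = Fraction(1)
for v in xs:
    rhs /= v
```

Exact rational arithmetic (`fractions.Fraction`) avoids any floating-point doubt about whether the two sides really agree.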
Let me try. The left paw side of your cat is $\sum\limits_{\sigma\in \mathrm{Sh}\left(1,n-1\right)} f\left(x_{\sigma^{-1}\left(1\right)},x_{\sigma^{-1}\left(2\right)},...,x_{\sigma^{-1}\left(n\right)}\right)$, where $\mathrm{Sh}\left(a,b\right)$ is defined as the subgroup
$\left\lbrace \sigma \in S_{a+b} \mid \sigma\left(1\right) < \sigma\left(2\right) < ... < \sigma\left(a\right) \text{ and } \sigma\left(a+1\right) < \sigma\left(a+2\right) < ... < \sigma\left(a+b\right) \right\rbrace$
of the symmetric group $S_{a+b}$. (The elements of this subgroup $\mathrm{Sh}\left(a,b\right)$ are known as $\left(a,b\right)$-shuffles.) Now I suspect that
$\sum\limits_{\sigma\in \mathrm{Sh}\left(a,b\right)} f\left(x_{\sigma^{-1}\left(1\right)},x_{\sigma^{-1}\left(2\right)},...,x_{\sigma^{-1}\left(a+b\right)}\right) = f\left(x_1,x_2,...,x_a\right) f\left(x_{a+1},x_{a+2},...,x_{a+b}\right)$
for any $a$ and $b$ and any $x_i$.
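Here is a small sympy sanity check of this suspicion, again taking $f\left(x_1,\ldots,x_m\right)=\prod_{i=1}^m \frac{1}{x_1+\cdots+x_i}$ (an assumption on my part, as above):

```python
from itertools import permutations
import sympy as sp

def f(args):
    # assumed definition: f(x_1, ..., x_m) = prod_{i=1}^m 1/(x_1 + ... + x_i)
    s, prod = sp.Integer(0), sp.Integer(1)
    for x in args:
        s += x
        prod /= s
    return prod

def shuffle_defect(a, b):
    # returns LHS - RHS of the conjectured identity as a rational function;
    # it should cancel to 0 identically
    n = a + b
    xs = sp.symbols(f'x1:{n + 1}')
    lhs = sp.Integer(0)
    for perm in permutations(range(1, n + 1)):           # perm[i-1] = sigma(i)
        if all(perm[i] < perm[i + 1] for i in range(a - 1)) and \
           all(perm[i] < perm[i + 1] for i in range(a, n - 1)):
            inv = [0] * n
            for pos, val in enumerate(perm):
                inv[val - 1] = pos                        # sigma^{-1}(val) = pos + 1
            lhs += f([xs[inv[j]] for j in range(n)])
    return sp.cancel(lhs - f(list(xs[:a])) * f(list(xs[a:])))
```

`sp.cancel` puts the difference over a common denominator, so a return value of `0` certifies the identity symbolically for the given $a,b$, not just at sample points.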
This generalizes your cat. Does it generalize Littlewood's? Yes, at least if we generalize it even further, to the so-called $\left(a_1,a_2,...,a_k\right)$-multishuffles (which are the permutations $\sigma\in S_{a_1+a_2+...+a_k}$ increasing on each of the intervals $\left[A_{i-1}+1,A_i\right]$, where $A_i = a_1+a_2+...+a_i$ and $A_0=0$). This is not much of a generalization, since it follows from the $\left(a,b\right)$-shuffle version by induction over $k$, but applying it to $\left(1,1,...,1\right)$-multishuffles (which are simply all the elements of $S_n$) yields Littlewood's cat.
Now I see that Littlewood's cat even follows from yours, if we notice that every permutation $\sigma\in S_n$ can be written uniquely as a product $t_1t_2...t_{n-1}$, where each $t_k$ moves $k$ some number of places to the right. (This is one of the stupid sorting algorithms.)
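The uniqueness claim can be verified by brute force. My formalization (an assumption): $t_k$ sends $k$ to position $k+j$ for some $0\leq j\leq n-k$, sliding the skipped entries one step left, and the product is composed left to right. Since there are exactly $n!$ choices of $\left(t_1,\ldots,t_{n-1}\right)$, uniqueness is equivalent to all products being distinct:

```python
from itertools import product, permutations

def shift_right(n, k, j):
    # the permutation t of {1,...,n} moving k exactly j places to the right:
    # t(k) = k + j, t(m) = m - 1 for k < m <= k + j, identity elsewhere
    t = list(range(1, n + 1))
    t[k - 1] = k + j
    for m in range(k + 1, k + j + 1):
        t[m - 1] = m - 1
    return tuple(t)

def compose(s, t):
    # (s o t)(i) = s(t(i)); a permutation p is stored as a tuple with p[i-1] = p(i)
    return tuple(s[t[i] - 1] for i in range(len(t)))

n = 4
products = set()
for js in product(*[range(n - k + 1) for k in range(1, n)]):
    sigma = tuple(range(1, n + 1))
    for k, j in zip(range(1, n), js):
        sigma = compose(sigma, shift_right(n, k, j))     # sigma = t_1 t_2 ... t_{n-1}
    products.add(sigma)
# if all n! products are distinct, every permutation arises exactly once
```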
Oh, and I don't have a proof of my cat, but it can catch mice, so it's a good cat, isn't it?
Sam is of course correct about the $q$-hook length formula. Below is a short self-contained proof not relying on such advanced combinatorics.
Denote by $h_1>\ldots>h_k$ the hook lengths of the first column of the diagram $\lambda$. Then the multiset of hooks of $\lambda$ is $\bigcup_{i=1}^k \{1,2,\ldots,h_i\}\setminus \{h_i-h_j:i<j\}$ (multiset union and multiset difference), and $n=\sum_i h_i-\frac{k(k-1)}2$.
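This description of the hook multiset is easy to check in code; here is a quick sanity check on a couple of (arbitrarily chosen) shapes:

```python
from collections import Counter

def hooks(partition):
    # hook length of the cell in row i, column j (0-based): arm + leg + 1
    cols = [sum(1 for r in partition if r > j) for j in range(partition[0])]
    return [partition[i] - j + cols[j] - i - 1
            for i in range(len(partition)) for j in range(partition[i])]

def claimed_hooks(partition):
    # first-column hooks h_1 > ... > h_k, then the claimed multiset:
    # union of {1,...,h_i} minus the differences h_i - h_j for i < j
    k = len(partition)
    h = [partition[i] + k - i - 1 for i in range(k)]
    multiset = Counter()
    for hi in h:
        multiset.update(range(1, hi + 1))
    for i in range(k):
        for j in range(i + 1, k):
            multiset[h[i] - h[j]] -= 1
    return +multiset          # drop entries whose count became zero
```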
Recall that $F(m)=P_m(\alpha,\beta)=\prod_{d|m,d>1}\Phi_d(\alpha,\beta)=\prod_d (\Phi_d(\alpha,\beta))^{\eta_d(m)}$, where
$\alpha,\beta=(1\pm \sqrt{5})/2$;
$P_n(x,y)=x^{n-1}+x^{n-2}y+\ldots+y^{n-1}$;
$\Phi_d$ are homogeneous cyclotomic polynomials;
$\eta_d(m)=\chi_{\mathbb{Z}}(m/d)$ (i.e., it equals 1 if $d$ divides $m$, and 0 otherwise).
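Here $F(m)=P_m(\alpha,\beta)=(\alpha^m-\beta^m)/(\alpha-\beta)$ is the $m$-th Fibonacci number, and the cyclotomic factorization above can be spot-checked with sympy (a sketch using sympy's built-in cyclotomic polynomials, homogenized by hand):

```python
import sympy as sp

x, y = sp.symbols('x y')
alpha = (1 + sp.sqrt(5)) / 2
beta = (1 - sp.sqrt(5)) / 2

def Phi_hom(d):
    # homogeneous cyclotomic polynomial: Phi_d(x, y) = y^phi(d) * Phi_d(x/y)
    return sp.expand(y ** sp.totient(d) * sp.cyclotomic_poly(d, x).subs(x, x / y))

def F_via_cyclotomics(m):
    # product of Phi_d(alpha, beta) over the divisors d > 1 of m
    prod = sp.Integer(1)
    for d in sp.divisors(m):
        if d > 1:
            prod *= Phi_hom(d).subs({x: alpha, y: beta})
    return sp.simplify(sp.expand(prod))
```

Since $\sqrt5$ squares away under expansion, each product collapses to an integer, which can be compared with `sp.fibonacci(m)`.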
Therefore it suffices to prove that for any fixed $d>1$ we have
$$
\sum_{m=1}^n \eta_d(m)+\sum_{i<j}\eta_d(h_i-h_j)-\sum_{i=1}^k\sum_{j=1}^{h_i}\eta_d(j)\geqslant 0.\quad (\ast)
$$
$(\ast)$ rewrites as
$$
[n/d]+\left|\left\{i<j: h_i\equiv h_j \pmod d\right\}\right|-\sum_{i=1}^k [h_i/d]\geqslant 0.\quad (\bullet)
$$
The LHS of $(\bullet)$ does not change if we reduce all the $h_i$'s modulo $d$ (changing $n=\sum_i h_i-\frac{k(k-1)}2$ accordingly, of course), so we may suppose that $0\leqslant h_i\leqslant d-1$ for all $i$. For $a=0,1,\dots, d-1$ denote $t_a=\left|\left\{i: h_i=a\right\}\right|$. Then $(\bullet)$ rewrites as
$$
\left[\frac{-\binom{\sum_{i=0}^{d-1} t_i}2+\sum_{i=0}^{d-1} it_i}d\right]+
\sum_{i=0}^{d-1} \binom{t_i}2\geqslant 0. \quad (\star)
$$
It remains to observe that the LHS of $(\star)$ equals
$$
\left[\frac1d\sum_{i<j}\binom{t_i-t_j}2 \right].
$$
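This last observation admits a quick brute-force sanity check (the brackets are read as the integer part, i.e. floor, which Python's `//` implements even for negative numerators; note also that $\binom{m}{2}=\frac{m(m-1)}2$ is nonnegative for negative $m$ as well):

```python
from itertools import product

def binom2(m):
    # binomial(m, 2) = m(m-1)/2, valid for negative m too
    return m * (m - 1) // 2

def lhs_star(d, t):
    # [(-C(sum t_i, 2) + sum_i i*t_i) / d] + sum_i C(t_i, 2)
    num = -binom2(sum(t)) + sum(i * ti for i, ti in enumerate(t))
    return num // d + sum(binom2(ti) for ti in t)

def rhs_star(d, t):
    # [(1/d) * sum_{i<j} C(t_i - t_j, 2)]
    s = sum(binom2(t[i] - t[j])
            for i in range(len(t)) for j in range(i + 1, len(t)))
    return s // d

# check the equality for all small cases: d = 2..4, each t_i in 0..3
ok = all(lhs_star(d, t) == rhs_star(d, t)
         for d in range(2, 5)
         for t in product(range(4), repeat=d))
```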
Best Answer
For power series $u(x_1,\ldots,x_n),v(x_1,\ldots,x_n)$ call $u,v$ similar and write $u\sim v$ if all monomials $\prod x_i^{c_i}$ with $c_i\in \{0,1\}$ have equal coefficients in $u,v$. In other words, if $u$ is congruent to $v$ modulo the ideal generated by the $x_i^2$'s. Note that this similarity respects addition and multiplication, and that $(1-x_i)^{-1}\sim \exp(x_i)$ and $(1-x_i-x_j)^{-1}\sim 1+x_i+x_j+2x_ix_j\sim\exp(x_i+x_j+x_ix_j)$. Thus \begin{align*} {\rm CT}\, F&= [x_1\ldots x_n] \prod_i (1-x_i)^{-2}\prod_{i<j}(1-x_i-x_j)^{-1}\\&= [x_1\ldots x_n]\prod_i\exp(2x_i)\prod_{i<j}\exp(x_i+x_j+x_ix_j)\\&=[x_1\ldots x_n] \exp\left( \sum 2x_i+\sum_{i<j}(x_i+x_j+x_ix_j) \right)\\ &=[x_1\ldots x_n]\exp\left((n+1)S+S^2/2\right), \end{align*} where $S=x_1+\ldots+x_n$ (since $S^2/2\sim \sum_{i<j} x_ix_j$). Now if we expand $\exp\left((n+1)S+S^2/2\right)$ as a power series in $S$, we can use $[x_1\ldots x_n]S^n=n!$ and $[x_1\ldots x_n]S^k=0$ for $k\ne n$, and your identity follows.
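The computation above can be spot-checked for small $n$: expanding $\exp\left((n+1)S+S^2/2\right)$ in powers of $S$ and picking out $S^n$ gives ${\rm CT}\, F = n!\sum_{a+2b=n}\frac{(n+1)^a}{2^b\,a!\,b!}$, which can be compared against the coefficient of $x_1\cdots x_n$ extracted directly from the product (each factor truncated at total degree $n$, which cannot affect a degree-$n$ coefficient):

```python
from math import factorial
import sympy as sp

def ct_F(n):
    # coefficient of x_1...x_n in prod_i (1-x_i)^(-2) * prod_{i<j} (1-x_i-x_j)^(-1),
    # with every factor replaced by its Taylor expansion up to total degree n
    xs = sp.symbols(f'x1:{n + 1}')
    expr = sp.Integer(1)
    for xi in xs:
        expr *= sum((k + 1) * xi ** k for k in range(n + 1))         # (1-xi)^(-2)
    for i in range(n):
        for j in range(i + 1, n):
            expr *= sum((xs[i] + xs[j]) ** k for k in range(n + 1))  # (1-xi-xj)^(-1)
    return sp.Poly(sp.expand(expr), *xs).coeff_monomial(sp.Mul(*xs))

def predicted(n):
    # n! * sum over a + 2b = n of (n+1)^a / (2^b a! b!)
    total = sp.Integer(0)
    for b in range(n // 2 + 1):
        a = n - 2 * b
        total += sp.Rational((n + 1) ** a, 2 ** b * factorial(a) * factorial(b))
    return factorial(n) * total
```

For $n=2$ both sides come out as $10$, which also matches a short hand computation modulo the ideal $(x_1^2,x_2^2)$.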