This is slightly too long to be a comment, though it is not an answer:
Edit: I have made a few revisions, but I still have not found a proof.
My previous comment was not quite correct: I "rearranged" the inequality by raising both sides to certain powers, without accounting for the fact that we do not know a priori whether each term is larger or smaller than $1$. Depending on the size of each term (in particular, whether it is smaller or larger than $1$), exponentiating both sides can reverse the inequality. I have returned the question to its original form, as suggested by the OP, and added one more observation.
Let me first say that I have been working on this question for a bit, and though I have not yet resolved it, I have been having fun trying!
Now, to emphasize the dependence on $n$, let's set
$$
\alpha_n = \sum_{i=1}^n a_i, \qquad \beta_n = \sum_{i=1}^n b_i, \qquad \sigma_n = \sum_{i=1}^n c_i,
$$
where $c_i = \sqrt{a_i b_i}$. Further, let's put
$$
A_n = \prod_{i=1}^n (a_i)^{a_i}, \qquad B_n = \prod_{i=1}^n (b_i)^{b_i}, \qquad S_n = \prod_{i=1}^n (c_i)^{c_i}.
$$
Our goal is to show:
\begin{equation}
\left(\frac{A_n}{(\alpha_n)^{\alpha_n}}\right)^{\frac{1}{\alpha_n}} \cdot \left(\frac{B_n}{(\beta_n)^{\beta_n}} \right)^{\frac{1}{\beta_n}} \leq \left(\frac{S_n}{(\sigma_n)^{\sigma_n}}\right)^{\frac{2\sigma_n}{\alpha_n \beta_n}} \tag{1}
\end{equation}
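Though it proves nothing, a quick numerical probe can at least hunt for counterexamples. Here is a minimal Python sketch (the sampling ranges and names are arbitrary choices of mine); it compares the logarithms of the two sides of (1), which avoids overflow from the large exponents:

```python
import math
import random

def holds(a, b, tol=1e-12):
    """Check log-LHS <= log-RHS of inequality (1) for positive a_i, b_i."""
    c = [math.sqrt(x * y) for x, y in zip(a, b)]
    alpha, beta, sigma = sum(a), sum(b), sum(c)
    log_A = sum(x * math.log(x) for x in a)
    log_B = sum(x * math.log(x) for x in b)
    log_S = sum(x * math.log(x) for x in c)
    lhs = (log_A - alpha * math.log(alpha)) / alpha \
        + (log_B - beta * math.log(beta)) / beta
    rhs = (2 * sigma / (alpha * beta)) * (log_S - sigma * math.log(sigma))
    return lhs <= rhs + tol

random.seed(0)
for _ in range(100_000):
    n = random.randint(1, 6)
    a = [random.uniform(0.01, 5.0) for _ in range(n)]
    b = [random.uniform(0.01, 5.0) for _ in range(n)]
    if not holds(a, b):
        print("possible counterexample:", a, b)
        break
else:
    print("no counterexample found")
```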
A few pedestrian observations:
If $a_i = b_i$ for $i = 1, \dots, n$ (which forces $c_i = a_i = b_i$), then $A_n = B_n = S_n$ and $\alpha_n = \beta_n = \sigma_n$, so (1) holds with equality in this case.
Note that $2c_i \leq a_i + b_i$, since $2xy \leq x^2 + y^2$ for all real $x, y$ (take $x = \sqrt{a_i}$, $y = \sqrt{b_i}$). Hence $2\sigma_n \leq \alpha_n + \beta_n$. Furthermore, Cauchy-Schwarz gives $\sigma_n^2 \leq \alpha_n \beta_n$. Together these two bounds imply $(\sigma_n + 1)^2 \leq (\alpha_n + 1)(\beta_n + 1)$, as spelled out below.
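Indeed, expanding the square and applying the two bounds,
$$
(\sigma_n + 1)^2 = \sigma_n^2 + 2\sigma_n + 1 \leq \alpha_n \beta_n + (\alpha_n + \beta_n) + 1 = (\alpha_n + 1)(\beta_n + 1).
$$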
I would imagine that with enough creativity, one may find a proof of the inequality involving convexity or a simple application of the AM-GM inequality (which I suppose is much the same thing!).
I have been unable to prove the inequality even in the case $n = 2$ with the normalization $\alpha_2 = \beta_2 = 1$, so I am not hopeful about the general case.
Writing $Y = g(X)$, you can also use the chain rule for joint entropy:
$$H(X,Y) = H(X) +H(Y|X) = H(Y) +H(X|Y) $$
Now, $H(Y|X) = \sum_x P(X=x)\, H(Y \mid X=x)$. But $H(Y \mid X=x) = 0$, because for any given $X=x$, $Y$ is deterministic: $P(Y=y \mid X=x)$ equals $1$ for exactly one $y$ and is zero elsewhere.
Hence $H(Y|X)=0$ , and $$H(Y) = H(X)-H(X|Y) \le H(X)$$
because $H(X|Y)\ge 0$.
Equality occurs if $H(X|Y) = 0$, that is, if knowing $g(X)$ one knows $X$; that is, if $g$ is one-to-one on the support of $X$.
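To illustrate numerically (a throwaway sketch; the distribution and the choice of $g$ are mine, not from the question), a non-one-to-one $g$ makes the inequality strict:

```python
import math
from collections import defaultdict

def entropy(p):
    """Shannon entropy (in bits) of a dict {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# X uniform on {-2, -1, 0, 1, 2}; g(x) = x^2 is not one-to-one here
p_X = {x: 0.2 for x in (-2, -1, 0, 1, 2)}
g = lambda x: x * x

# push-forward distribution of Y = g(X)
p_Y = defaultdict(float)
for x, q in p_X.items():
    p_Y[g(x)] += q

print(entropy(p_X))  # log2(5) ≈ 2.3219
print(entropy(p_Y))  # ≈ 1.5219 < H(X), strictly, since g merges outcomes
```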
Best Answer
Let $x$ be the new sample. By the definition of the empirical distribution,
$P_{n+1} = \frac{n}{n+1} P_n + \frac{1}{n+1} \delta_x,$ where $\delta_x$ is the unit mass at $x$.
Since $x \notin \operatorname{support}(P_n)$, this is a disjoint mixture of two distributions. As an exercise, show that if $P = \lambda Q + (1-\lambda) R$ for $\lambda \in [0,1]$ and distributions $Q, R$ with disjoint support, then $$ H(P) = h_2(\lambda) + \lambda H(Q) + (1-\lambda) H(R),$$ where $h_2$ is the binary entropy function.
(This is Exercise 2.10 in the 2nd edition of Cover and Thomas.)
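A sketch of that exercise, writing $q_i$ and $r_j$ for the masses of $Q$ and $R$ on their disjoint supports (so $\sum_i q_i = \sum_j r_j = 1$):
$$
H(P) = -\sum_i \lambda q_i \log(\lambda q_i) - \sum_j (1-\lambda) r_j \log\big((1-\lambda) r_j\big) = h_2(\lambda) + \lambda H(Q) + (1-\lambda) H(R).
$$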
Now, applying this with $\lambda = \frac{n}{n+1}$, $Q = P_n$, and $R = \delta_x$ (so $H(R) = 0$, and $h_2\left(\frac{n}{n+1}\right) = h_2\left(\frac{1}{n+1}\right)$ by symmetry), we have $$ H(P_{n+1}) = h_2\left(\frac{1}{n+1}\right) + \frac{n}{n+1} H(P_n).$$
So, since $H(P_{n+1}) - H(P_n) = h_2\left(\frac{1}{n+1}\right) - \frac{H(P_n)}{n+1}$, the question becomes whether $$\frac{H(P_n)}{n+1} \overset{?}{<} h_2\left(\frac{1}{n+1}\right).$$
To show this, note first that the support of $P_n$ has at most $n$ symbols, so $H(P_n) \le \log n$. But then $$h_2\left(\frac{1}{n+1}\right) = \frac{\log (n+1)}{n+1} + \frac{n}{n+1}\log \frac{n+1}{n} \ge \frac{\log (n+1)}{n+1} > \frac{\log n}{n+1} \ge \frac{H(P_n)}{n+1}. $$
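For what it's worth, a small Python check of both the decomposition and the strict increase (the sample data are made up):

```python
import math
from collections import Counter

def entropy(counts):
    """Entropy (in nats) of the empirical distribution of the counts."""
    n = sum(counts.values())
    return -sum(c / n * math.log(c / n) for c in counts.values())

def h2(p):
    """Binary entropy in nats."""
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

samples = [0, 0, 0, 1, 1, 2, 3]    # hypothetical data, n = 7
counts = Counter(samples)
n = len(samples)

H_n = entropy(counts)
counts["new"] = 1                  # a fresh symbol: x not in support(P_n)
H_n1 = entropy(counts)

# the decomposition H(P_{n+1}) = h2(1/(n+1)) + n/(n+1) * H(P_n)
assert abs(H_n1 - (h2(1 / (n + 1)) + n / (n + 1) * H_n)) < 1e-9
print(H_n, H_n1)                   # entropy strictly increases
```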