This is slightly too long to be a comment, though it is not an answer:
Edit: I am providing a few revisions, but I still have not found a proof.
My previous comment was not quite correct. I "rearranged" the inequality by raising both sides to certain powers, without taking into account that we do not know a priori whether any of the terms are larger or smaller than $1$. Depending on whether a term is smaller or larger than $1$, exponentiating both sides could reverse the inequality. I have returned the question to its original form as the OP suggested, and I have added one more observation.
Let me first say that I have been working on this question for a bit, and though I have not yet resolved it, I have been having fun trying!
Now, to emphasize the dependence on $n$, let's set
$$
\alpha_n = \sum_{i=1}^n a_i, \qquad \beta_n = \sum_{i=1}^n b_i, \qquad \sigma_n = \sum_{i=1}^n c_i,
$$
where $c_i = \sqrt{a_i b_i}$. Further, let's put
$$
A_n = \prod_{i=1}^n (a_i)^{a_i}, \qquad B_n = \prod_{i=1}^n (b_i)^{b_i}, \qquad S_n = \prod_{i=1}^n (c_i)^{c_i}.
$$
Our goal is to show:
$$
\left(\frac{A_n}{\alpha_n^{\alpha_n}}\right)^{\frac{1}{\alpha_n}} \cdot \left(\frac{B_n}{\beta_n^{\beta_n}} \right)^{\frac{1}{\beta_n}} \leq \left(\frac{S_n}{\sigma_n^{\sigma_n}}\right)^{\frac{2\sigma_n}{\alpha_n \beta_n}}\tag{1}
$$
A few pedestrian observations:
If $a_i = b_i$ for $i = 1, \dots , n$ (which forces $c_i = a_i = b_i$), then $A_n = B_n = S_n$, we also have $\alpha_n = \beta_n = \sigma_n$, and (1) holds in this case.
Note that $2c_i \leq a_i + b_i$, since $2xy \leq x^2 + y^2$ for all real $x, y$ (take $x = \sqrt{a_i}$, $y = \sqrt{b_i}$). Hence, $2\sigma_n \leq \alpha_n + \beta_n$. Furthermore, Cauchy-Schwarz gives $\sigma_n^2 \leq \alpha_n \beta_n$. Together, these two observations give $(\sigma_n + 1)^2 = \sigma_n^2 + 2\sigma_n + 1 \leq \alpha_n \beta_n + \alpha_n + \beta_n + 1 = (\alpha_n + 1)(\beta_n + 1)$.
I would imagine that with enough creativity, one may find a proof of the inequality involving convexity or a simple application of the AM-GM inequality (which I suppose is much the same thing!).
I have been unable to prove the inequality even in the case $n = 2$ with the normalization $\alpha_n = \beta_n = 1$, so I am not hopeful about the general case.
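Lacking a proof, here is a quick numeric sanity check (a sketch of my own, not part of the question): it samples random positive $a_i$ and $b_i$ and compares the logarithms of the two sides of (1). Random trials can only hunt for counterexamples, of course; they prove nothing.

```python
import math
import random

def check_inequality(n_trials=10_000, n=4, tol=1e-9):
    """Randomly test inequality (1), comparing logs of both sides."""
    for _ in range(n_trials):
        a = [random.uniform(0.01, 5.0) for _ in range(n)]
        b = [random.uniform(0.01, 5.0) for _ in range(n)]
        c = [math.sqrt(x * y) for x, y in zip(a, b)]
        al, be, si = sum(a), sum(b), sum(c)
        # log of (A_n / alpha_n^alpha_n)^(1/alpha_n) and its analogue in b
        lhs = (sum(x * math.log(x) for x in a) - al * math.log(al)) / al \
            + (sum(x * math.log(x) for x in b) - be * math.log(be)) / be
        # log of (S_n / sigma_n^sigma_n)^(2 sigma_n / (alpha_n beta_n))
        rhs = 2 * si / (al * be) \
            * (sum(x * math.log(x) for x in c) - si * math.log(si))
        if lhs > rhs + tol:
            print("possible counterexample:", a, b)
            return
    print("no violation found in", n_trials, "trials")

check_inequality()
```

Comparing logarithms sidesteps overflow from the $x^x$-type factors.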
Problem: Assume that we know the following:
$\sum_i p_i\log \frac{p_i}{q_i} \ge 0$ for all $p_i > 0$, $q_i > 0$
with $\sum_i p_i = 1$ and $\sum_i q_i = 1$.
Prove that $\prod_i x_i^{a_i} \le \sum_i a_i x_i$
for all $x_i > 0$, $a_i > 0$ with $\sum_i a_i = 1$.
Solution: Taking the logarithm of both sides, the inequality to be proved becomes
$$\sum\nolimits_i a_i\log x_i \le \log \sum\nolimits_i a_i x_i
= (\sum\nolimits_i a_i)\log \sum\nolimits_i a_i x_i
= \sum\nolimits_i a_i\log \sum\nolimits_j a_j x_j $$
or
$$\sum\nolimits_i a_i \Big(\log \sum\nolimits_j a_j x_j - \log x_i\Big)\ge 0$$
or
$$\sum\nolimits_i a_i \Big(\log a_i - \log \frac{a_ix_i}{\sum\nolimits_j a_j x_j}\Big)\ge 0.\tag{1}$$
Let
$$p_i = a_i, \quad q_i = \frac{a_ix_i}{\sum_j a_j x_j}, \quad i=1, 2, \dots, n.$$
Then $p_i > 0$, $q_i > 0$, $\sum_i p_i = 1$, and $\sum_i q_i = \frac{\sum_i a_i x_i}{\sum_j a_j x_j} = 1$, so the assumed inequality applies and (1) holds.
We are done.
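To illustrate the result (a minimal sketch of my own, not part of the argument), here is a numeric check of the weighted AM-GM inequality just proved, on random inputs:

```python
import math
import random

# Numeric illustration of the weighted AM-GM just proved:
# prod x_i^{a_i} <= sum a_i x_i for x_i > 0 and weights a_i summing to 1.
n = 5
x = [random.uniform(0.1, 10.0) for _ in range(n)]
w = [random.random() for _ in range(n)]
a = [v / sum(w) for v in w]  # a_i > 0 with sum a_i = 1

geometric = math.prod(xi ** ai for xi, ai in zip(x, a))  # prod x_i^{a_i}
arithmetic = sum(ai * xi for ai, xi in zip(a, x))        # sum a_i x_i
print(geometric, "<=", arithmetic, geometric <= arithmetic)
```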
Best Answer
Let $\phi(x) = x\log x$. This function is convex on $[0, 1]$ (indeed, $\phi''(x) = \frac{1}{x} > 0$ for $x > 0$).
The entropy $S$ is defined as:
$$S(p) = -\sum \phi(p_i),$$
where $p=[p_1, p_2, \ldots, p_N]$.
Then, using Jensen's inequality, you get:
$$ \phi \left(\frac{\sum{p_i}}{N}\right)\leq \frac{\sum{\phi (p_i)}}{N} \Rightarrow \phi \left(\frac{\sum{p_i}}{N}\right)\leq -\frac{S(p)}{N} \Rightarrow S(p) \leq -N \phi \left(\frac{\sum{p_i}}{N}\right).$$
Notice that, whatever $p$ is, by definition $\sum p_i = 1$, and hence:
$$S(p) \leq -N\phi\left(\frac{1}{N}\right) = -N\left(\frac{1}{N}\log\frac{1}{N}\right) = \log N.$$
This means that the entropy is at most equal to $\log N$.
Now, notice that $S(p) = \log N$ for $p = \left[\frac{1}{N}, \ldots, \frac{1}{N}\right]$.
Then: $$p_i = \frac{1}{N} ~\forall i \implies S(p) ~\text{attains its maximum value}~ \log N.$$
We conclude the proof by observing that this maximizer $p_i = \frac{1}{N}$ for all $i$ is unique, since $S(p)$ is strictly concave.
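As a small numeric illustration (my own sketch; `entropy` is a hypothetical helper, not from the answer), one can compare $S(p)$ for a random distribution against $\log N$ and check equality at the uniform distribution:

```python
import math
import random

def entropy(p):
    """S(p) = -sum_i p_i log p_i, with the convention 0 log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

N = 8
w = [random.random() for _ in range(N)]
p = [v / sum(w) for v in w]  # random distribution: p_i > 0, sum p_i = 1

print(entropy(p) <= math.log(N) + 1e-12)                # bound S(p) <= log N
print(math.isclose(entropy([1 / N] * N), math.log(N)))  # equality at uniform
```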