This is slightly too long to be a comment, though it is not an answer:
Edit: I am providing a few revisions, but I still have not found a proof.
My previous comment was not quite correct: I "rearranged" the inequality by raising both sides to certain powers, without taking into account that we do not know a priori whether each term is larger or smaller than $1$. Depending on the size of each term (in particular, whether it is smaller or larger than $1$), exponentiating both sides could reverse the inequality. I have returned the question to its original form, as the OP suggested, and I have added one more observation.
Let me first say that I have been working on this question for a bit, and though I have not yet resolved it, I have been having fun trying!
Now, to emphasize the dependence on $n$, let's set
$$
\alpha_n = \sum_{i=1}^n a_i, \qquad \beta_n = \sum_{i=1}^n b_i, \qquad \sigma_n = \sum_{i=1}^n c_i,
$$
where $c_i = \sqrt{a_i b_i}$. Further, let's put
$$
A_n = \prod_{i=1}^n (a_i)^{a_i}, \qquad B_n = \prod_{i=1}^n (b_i)^{b_i}, \qquad S_n = \prod_{i=1}^n (c_i)^{c_i}.
$$
Our goal is to show:
\begin{equation}
\left(\frac{A_n}{(\alpha_n)^{\alpha_n}}\right)^{\frac{1}{\alpha_n}} \cdot \left(\frac{B_n}{(\beta_n)^{\beta_n}} \right)^{\frac{1}{\beta_n}} \leq \left(\frac{S_n}{(\sigma_n)^{\sigma_n}}\right)^{\frac{2\sigma_n}{\alpha_n \beta_n}} \tag{1}
\end{equation}
A few pedestrian observations:
If $a_i = b_i$ for $i = 1, \dots, n$ (which forces $c_i = a_i = b_i$), then $A_n = B_n = S_n$ and $\alpha_n = \beta_n = \sigma_n$, so (1) holds (with equality) in this case.
Note that $2c_i \leq a_i + b_i$: apply $2xy \leq x^2 + y^2$ with $x = \sqrt{a_i}$ and $y = \sqrt{b_i}$. Hence, $2\sigma_n \leq \alpha_n + \beta_n$. Furthermore, Cauchy-Schwarz gives $\sigma_n^2 \leq \alpha_n \beta_n$. Together, these two observations imply that $(\sigma_n + 1)^2 \leq (\alpha_n + 1)(\beta_n + 1)$.
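Indeed, expanding and applying the two bounds term by term:
$$(\sigma_n + 1)^2 = \sigma_n^2 + 2\sigma_n + 1 \leq \alpha_n \beta_n + (\alpha_n + \beta_n) + 1 = (\alpha_n + 1)(\beta_n + 1).$$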
I would imagine that with enough creativity, one may find a proof of the inequality involving convexity or a simple application of the AM-GM inequality (which I suppose is much the same thing!).
I have been unable to prove the inequality even in the case $n = 2$ under the normalization $\alpha_n = \beta_n = 1$. I am not hopeful about a proof of the general case.
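For what it's worth, here is a quick numerical sanity check of (1) on random positive inputs (a sketch in Python; the helper name `log_term` and the sampling ranges are my own choices, not anything from the question). It compares logarithms of the two sides, which keeps the products of terms $x^x$ from overflowing; a negative gap would signal a counterexample.

```python
import math
import random

def log_term(xs):
    """Return log( prod_i x_i^{x_i} / s^s ), where s = sum(xs)."""
    s = sum(xs)
    return sum(x * math.log(x) for x in xs) - s * math.log(s)

def gap(a, b):
    """(log of RHS) - (log of LHS) of inequality (1); negative = counterexample."""
    c = [math.sqrt(x * y) for x, y in zip(a, b)]
    alpha, beta, sigma = sum(a), sum(b), sum(c)
    log_lhs = log_term(a) / alpha + log_term(b) / beta
    log_rhs = (2.0 * sigma / (alpha * beta)) * log_term(c)
    return log_rhs - log_lhs

random.seed(0)
worst = min(
    gap([random.uniform(0.01, 10.0) for _ in range(n)],
        [random.uniform(0.01, 10.0) for _ in range(n)])
    for n in random.choices(range(2, 7), k=100_000)
)
print("smallest observed gap:", worst)
```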
The excellent book *The Cauchy-Schwarz Master Class* has already been mentioned in the comments by Theo.
Since I cannot stand open questions to which a lot of people know the answer, I'll just summarize what is in Chapter 9 of that book.
You're right that you can prove it the way you do, but in a typical course on measure and integration theory, Hölder's inequality is proven first and then Minkowski's inequality. For that reason it can be instructive to use Hölder's inequality to prove Minkowski's inequality.
There is another advantage to this approach: we can quite easily deduce from the proof when equality arises. To see this, let me quickly recall how the proof goes (it can be found in the book by Steele).
First, using the triangle inequality $|x_k + y_k| \leq |x_k| + |y_k|$, write
$$\sum_{k = 1}^n |x_k + y_k|^p \leq \sum_{k = 1}^n |x_k||x_k + y_k|^{p - 1} + \sum_{k = 1}^n |y_k||x_k + y_k|^{p - 1}.$$
Now we may assume $p > 1$, since for $p = 1$ this display is already Minkowski's inequality. Applying Hölder's inequality to both terms on the right-hand side, we find
$$\sum_{k = 1}^n |x_k||x_k + y_k|^{p - 1} \leq \left (\sum_{k = 1}^n |x_k|^p \right )^{1/p} \left (\sum_{k = 1}^n |x_k + y_k|^{p} \right )^{(p - 1)/p}$$
and
$$\sum_{k = 1}^n |y_k||x_k + y_k|^{p - 1} \leq \left (\sum_{k = 1}^n |y_k|^p \right )^{1/p} \left (\sum_{k = 1}^n |x_k + y_k|^{p} \right )^{(p - 1)/p}.$$
We may assume that $\sum_{k = 1}^n |x_k + y_k|^p$ is non-zero (otherwise there is nothing to prove), so we can divide both sides by $\displaystyle \left (\sum_{k = 1}^n |x_k + y_k|^{p} \right )^{(p - 1)/p}$ to finish the proof.
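Explicitly, combining the three displays and dividing leaves
$$\left(\sum_{k = 1}^n |x_k + y_k|^p\right)^{1/p} \leq \left(\sum_{k = 1}^n |x_k|^p\right)^{1/p} + \left(\sum_{k = 1}^n |y_k|^p\right)^{1/p},$$
which is Minkowski's inequality.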
Fine. Now, if we had equality in Minkowski's inequality, the first inequality written here would also be an equality. This implies $|x_k + y_k| = |x_k| + |y_k|$ for all $1 \leq k \leq n$, which holds precisely when $x_k$ and $y_k$ have the same sign for each $k$ (i.e., $x_k y_k \geq 0$). In fact, there is no harm in assuming $x_k, y_k \geq 0$, because we can factor out the common sign and it is absorbed by the absolute values.
But equality in Minkowski's inequality also means that we have equality in the two lines where Hölder's inequality is used. Recall what it means to have equality in Hölder's inequality: there exist $\lambda, \lambda' \geq 1$ such that
$$\lambda |x_k|^p = (|x_k + y_k|^{p - 1})^q = |x_k + y_k|^p \text{ and } \lambda' |y_k|^p = (|x_k + y_k|^{p - 1})^q = |x_k + y_k|^p.$$
Dividing the two equalities, we get $\frac{\lambda}{\lambda'} |x_k|^p = |y_k|^p$. So this proof can easily be traced back.
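In other words (recalling that we may take $x_k, y_k \geq 0$), the two vectors are proportional:
$$y_k = \left(\frac{\lambda}{\lambda'}\right)^{1/p} x_k \quad \text{for all } k,$$
which is the familiar equality condition for Minkowski's inequality.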
But again, credit must be given where credit is due: this is just what is written in Steele, in my own words. I don't think I'm plagiarizing, because this method can be considered common knowledge.
So check out the book in the library or buy it; it is quite cheap for a math book, and it contains fun exercises.
Best Answer
Hölder's inequality says that $$\sum |x_ky_k| \leq \left(\sum |x_k|^r\right)^{1/r}\left(\sum|y_k|^s\right)^{1/s}$$ where $\frac{1}{r}+\frac{1}{s}=1$ and $r, s > 1$.
Apply Hölder's inequality twice, once to each sum, using $x_k = a_k$ and $y_k = (|a_k|+|b_k|)^{p-1}$ in one, and similarly in the other, with $r=p$ and $\frac{1}{s}=1-\frac{1}{p}$.
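Spelled out (my own filling-in of the hint, assuming $p > 1$): start from
$$\sum_k (|a_k|+|b_k|)^p = \sum_k |a_k|(|a_k|+|b_k|)^{p-1} + \sum_k |b_k|(|a_k|+|b_k|)^{p-1},$$
apply Hölder with $r = p$ and $s = \frac{p}{p-1}$ to each sum, and note that $(p-1)s = p$, so both Hölder factors $\left(\sum_k (|a_k|+|b_k|)^{(p-1)s}\right)^{1/s}$ equal $\left(\sum_k (|a_k|+|b_k|)^{p}\right)^{(p-1)/p}$, which can then be divided out exactly as in the proof above.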