[Math] What's the relationship between log(sum) and sum(log)?

logarithms

Hi, I'm a little confused about the log(sum) function and the sum(log) function. In particular, what's the relationship between these two terms?
$$
-\log\left(\sum_{i}a_i\sum_i b_i\right)
$$

$$
-\sum_i\log(a_i+b_i)
$$
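
For a quick numerical sanity check that these are genuinely different quantities, here is a small Python sketch (the values of $a_i$ and $b_i$ are made up purely for illustration):

```python
import numpy as np

# Made-up positive values for a_i and b_i, purely for illustration.
a = np.array([0.2, 0.5, 0.3])
b = np.array([0.1, 0.7, 0.4])

log_of_sums = -np.log(a.sum() * b.sum())  # -log( (sum_i a_i) * (sum_i b_i) )
sum_of_logs = -np.log(a + b).sum()        # -sum_i log(a_i + b_i)

print(log_of_sums, sum_of_logs)           # in general these two numbers differ
```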


Thanks for the comment from @hardmath. Here is the original question:

Given the negative log-likelihood of an observation set:
$$
\mathbf{L}=-\sum_{i,j}\log(\pi_a M_{i,j}+\pi_bN_{i,j})
$$
where $C$ is a constant parameter and $\pi_a+\pi_b=1$ are the proportions of the two components, given the instance $O_{i,j}$.
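
As a concrete (purely illustrative) reading of this objective, here is a short Python sketch with made-up matrices $M$, $N$ and weights $\pi_a$, $\pi_b$; the specific values are assumptions, not part of the original problem:

```python
import numpy as np

# Hypothetical component matrices M_{i,j}, N_{i,j} and mixture weights.
M = np.array([[0.8, 0.2],
              [0.4, 0.6]])
N = np.array([[0.3, 0.7],
              [0.5, 0.5]])
pi_a, pi_b = 0.6, 0.4          # pi_a + pi_b = 1

# Negative log-likelihood L = -sum_{i,j} log(pi_a*M_{i,j} + pi_b*N_{i,j})
L = -np.log(pi_a * M + pi_b * N).sum()
print(L)
```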

$\textbf{Lemma 1}$
$$
-\log\sum_{k=1}^K f_k(x)=\min_{\Phi(x)\in \Delta_+}\sum_{k=1}^K\Phi_k(x)\left[\log\Phi_k(x)-\log f_k(x)\right] \\
\text{s.t. } \sum_{k=1}^K\Phi_k(x)=1,\ \Phi_k(x)\in (0,1)
$$

$\textbf{Proof}$
$$
\mathrm{RHS}=\sum_{k=1}^K\Phi_k(x)\log\frac{\Phi_k(x)}{f_k(x)} \\
\geq\sum_{k=1}^K\Phi_k(x)\log\frac{\sum_{k=1}^K\Phi_k(x)}{\sum_{k=1}^K f_k(x)}\quad\text{(log-sum inequality)} \\
=-\log\sum_{k=1}^K f_k(x)\quad\left(\text{since }\textstyle\sum_k\Phi_k(x)=1\right)
$$
Equality holds for $\Phi_k(x)=f_k(x)/\sum_{j}f_j(x)$, so the lower bound is attained and the minimum equals the left-hand side. $\square$

Let
$$
C=\sum_{i,j}\Phi^{i,j}_a\left[\log\Phi^{i,j}_a-\log(\pi_a M_{i,j})\right]+\Phi^{i,j}_b\left[\log\Phi^{i,j}_b-\log(\pi_b N_{i,j})\right]
$$
subject to the constraint that, for each $(i,j)$, $\Phi^{i,j}_a+\Phi^{i,j}_b=1$.

Then $\textbf{how to prove}$ that minimizing $C$ is equivalent to minimizing $\mathbf{L}$?

Following Lemma 1, we have
$$
\min C=-\log\sum_{i,j}(\pi_a M_{i,j})-\log\sum_{i,j}(\pi_b N_{i,j}).
$$
Then the next step is: how to prove the relationship between $\min C$ and $\mathbf{L}$?
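
For experimenting with the relationship between $C$ and $\mathbf{L}$, here is a small Python sketch reusing the hypothetical $M$, $N$, $\pi_a$, $\pi_b$ from the sketch above; it just evaluates $C$ at a couple of feasible choices of $\Phi$ and prints the result next to $\mathbf{L}$ (the second choice mirrors the equality case of Lemma 1):

```python
import numpy as np

# Reusing the hypothetical M, N, pi_a, pi_b from the earlier sketch.
M = np.array([[0.8, 0.2], [0.4, 0.6]])
N = np.array([[0.3, 0.7], [0.5, 0.5]])
pi_a, pi_b = 0.6, 0.4

L = -np.log(pi_a * M + pi_b * N).sum()

def C(phi_a):
    """C for a given Phi_a; Phi_b = 1 - Phi_a enforces the per-(i,j) constraint."""
    phi_b = 1.0 - phi_a
    return np.sum(phi_a * (np.log(phi_a) - np.log(pi_a * M))
                  + phi_b * (np.log(phi_b) - np.log(pi_b * N)))

# Try a few feasible Phi_a in (0,1) and compare C against L.
for phi_a in [np.full_like(M, 0.5),
              pi_a * M / (pi_a * M + pi_b * N)]:  # equality-case choice from Lemma 1
    print(C(phi_a), L)
```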

Best Answer

A comment about your log-sum inequality: it's just Jensen's inequality in disguise. Maybe you can do something similar for other weighted sums of logs.

For $\sum_k a_k=1$ and $a_k,b_k$ positive, we want to show
$$ \sum_k a_k\log(a_k/b_k)\geq\sum_k a_k\log(1/B), $$
where $B=\sum_k b_k$. Subtract the left-hand side from the right-hand side:
$$ \sum_k a_k\log(1/B)-\sum_k a_k\log(a_k/b_k)=\sum_ka_k\log\left(\frac{b_k}{a_kB}\right). $$
Because $\log$ is concave, Jensen's inequality gives
$$ \sum_ka_k\log\left(\frac{b_k}{a_kB}\right)\leq\log\left(\sum_k a_k\frac{b_k}{a_kB}\right)=\log\left(\frac{\sum_kb_k}{B}\right)=\log(1)=0. $$
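
A quick numerical check of this Jensen step in Python (random $a$ on the simplex and random positive $b$; the values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.dirichlet(np.ones(5))            # a_k > 0, sum_k a_k = 1
b = rng.uniform(0.1, 2.0, size=5)        # b_k > 0
B = b.sum()

lhs = np.sum(a * np.log(a / b))          # sum_k a_k log(a_k / b_k)
rhs = np.sum(a * np.log(1.0 / B))        # = log(1/B), since sum_k a_k = 1

gap = rhs - lhs                          # = sum_k a_k log(b_k / (a_k B))
jensen_bound = np.log(np.sum(a * b / (a * B)))  # = log(sum_k b_k / B) = 0

print(lhs >= rhs)                        # True: the claimed inequality
print(gap <= jensen_bound + 1e-12)       # True: the Jensen step
```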
