Why does a derivative within a summation get rid of the summation?

statistics, summation

On the second-to-last line of page 4 of this document, a mathematical step is performed that I cannot understand. This step leads to an answer that I have seen in many other places, so the step is likely correct, but I cannot see why. (Context: the standard uncertainty for the slope of a linear least-squares fit is being derived.)

The step is:
$$
\sigma^2 \sum\limits_{i=1}^n\left( \frac{\partial}{\partial y_i} \frac{n\sum\limits_{i=1}^n x_i y_i - \sum\limits_{i=1}^n x_i \sum\limits_{i=1}^n y_i}{n\sum\limits_{i=1}^n x_i^2-\left( \sum\limits_{i=1}^n x_i\right)^2} \right)^2 = \sigma^2\sum\limits_{i=1}^n \left( \frac{n x_i - \sum\limits_{j=1}^n x_j}{n\sum\limits_{i=1}^n x_i^2-\left( \sum\limits_{i=1}^n x_i\right)^2}\right)^2
$$

Why does the first term in the numerator become $n x_i$ instead of $n \sum_{i=1}^n x_i$?

Also, why does the notation for the second term in the numerator switch from index $i$ to index $j$?

UPDATE: The notation used here is hard to understand, as there are nested summations over the same index ($i$). This is how we got here:

This is the expression for the slope of a linear, least-squares fit. The points to which the line is being fit are $ \{ (x_i, y_i)\}$. I believe the notation up to this point is ok:
$$
m = \frac{n\sum\limits_{i=1}^n x_i y_i - \sum\limits_{i=1}^n x_i \sum\limits_{i=1}^n y_i}{n\sum\limits_{i=1}^n x_i^2-\left( \sum\limits_{i=1}^n x_i\right)^2}.
$$
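For concreteness, here is a small numerical check of this closed-form slope (the data and noise level below are made up purely for illustration); it agrees with `numpy.polyfit`:

```python
# Sketch: evaluate the closed-form slope formula above on invented data
# and compare it to a library least-squares fit.
import numpy as np

rng = np.random.default_rng(0)
n = 10
x = np.linspace(0.0, 9.0, n)
y = 2.5 * x + 1.0 + rng.normal(scale=0.3, size=n)  # invented noisy line

# m = (n*sum(x*y) - sum(x)*sum(y)) / (n*sum(x**2) - sum(x)**2)
m = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)

m_polyfit, _ = np.polyfit(x, y, 1)  # slope, intercept from the library fit
print(m, m_polyfit)                 # the two slopes agree
```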

We assume that for each $y_i$, there exists an uncertainty value $\sigma_i$. We want to find the uncertainty in $m$, which I will call $\sigma_m$. To do this we use the standard propagation-of-uncertainty (addition in quadrature) formula:
$$
\sigma_m^2 = \sum\limits_{i=1}^n \sigma_i^2 \left( \frac{\partial m}{\partial y_i}\right)^2.
$$
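I can evaluate this propagation formula numerically by estimating each $\partial m/\partial y_i$ with a central finite difference; the sketch below does that (the data and the $\sigma_i$ values are again invented for illustration):

```python
# Sketch of the quadrature propagation formula: finite-difference each
# dm/dy_i, then sum sigma_i^2 * (dm/dy_i)^2. Data and sigma are invented.
import numpy as np

def slope(x, y):
    """Closed-form least-squares slope m from the formula above."""
    n = len(x)
    return (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)

rng = np.random.default_rng(0)
n = 10
x = np.linspace(0.0, 9.0, n)
y = 2.5 * x + 1.0 + rng.normal(scale=0.3, size=n)
sigma = np.full(n, 0.3)        # assumed uncertainty on each y_i

eps = 1e-6
var_m = 0.0
for i in range(n):
    y_plus, y_minus = y.copy(), y.copy()
    y_plus[i] += eps
    y_minus[i] -= eps
    dm_dyi = (slope(x, y_plus) - slope(x, y_minus)) / (2 * eps)  # numerical dm/dy_i
    var_m += sigma[i]**2 * dm_dyi**2

print(np.sqrt(var_m))          # sigma_m from the quadrature formula
```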

My question really boils down to: can someone please help me find $\sigma_m^2$ from these two equations?

Best Answer

It is "easy" once you write out the few terms in the series.

In your example, we are taking the derivative with respect to $y_j$.

Thus the only part of the expression we need to consider is the numerator, since the denominator does not depend on any of the $y_i$:

$$ n\sum_j x_jy_j - \sum_j x_j \sum_j y_j. $$

This is basically
$$ n\left(x_1y_1 + x_2y_2 + \cdots + x_ny_n\right) - \left(x_1 + x_2 + \cdots + x_n\right) \cdot \left(y_1 + y_2 + \cdots + y_n\right), $$
so let's take the derivative with respect to the first variable, $y_1$:
$$ n\frac{\partial}{\partial y_1}\left(x_1y_1 + x_2y_2 + \cdots + x_ny_n\right) - \frac{\partial}{\partial y_1}\left(x_1 + x_2 + \cdots + x_n\right)\left(y_1 + y_2 + \cdots + y_n\right) \\ = n\left(\frac{\partial}{\partial y_1}(x_1y_1) + \frac{\partial}{\partial y_1}(x_2y_2) + \cdots + \frac{\partial}{\partial y_1}(x_ny_n)\right) - \left(x_1 + x_2 + \cdots + x_n\right)\frac{\partial}{\partial y_1}\left(y_1 + y_2 + \cdots + y_n\right). $$
As we can see, the only terms that survive are
$$ nx_1\cdot \frac{\partial}{\partial y_1}y_1 - \left(x_1 + x_2 + \cdots + x_n\right)\cdot \frac{\partial}{\partial y_1}y_1 = nx_1 - \left(x_1 + x_2 + \cdots + x_n\right) = nx_1 - \sum_j x_j, $$
with all other terms going to zero.

Rinse and repeat and you will see the pattern: for each $y_j$, the only terms that survive are $$ nx_j - \sum_i x_i. $$ (The dummy summation index is renamed so that it does not clash with the fixed index $j$ of the variable we are differentiating with respect to; in the question's notation, where the derivative is taken with respect to $y_i$, the dummy index becomes $j$ instead.) Since the denominator contains no $y_i$, dividing by it gives
$$ \frac{\partial m}{\partial y_j} = \frac{n x_j - \sum\limits_{i=1}^n x_i}{n\sum\limits_{i=1}^n x_i^2 - \left(\sum\limits_{i=1}^n x_i\right)^2}, $$
and substituting this into the quadrature formula yields exactly the right-hand side quoted in the question.
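As a quick sanity check (a sketch with invented data, not part of the derivation above), the finite-difference derivative of $m$ with respect to each $y_j$ matches $(n x_j - \sum_i x_i)$ divided by the denominator:

```python
# Sketch: verify numerically that dm/dy_j == (n*x_j - sum(x)) / D,
# where D = n*sum(x**2) - sum(x)**2. Data are invented for illustration.
import numpy as np

def slope(x, y):
    n = len(x)
    return (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)

rng = np.random.default_rng(1)
n = 8
x = rng.uniform(0.0, 5.0, n)
y = rng.uniform(0.0, 5.0, n)
D = n * np.sum(x**2) - np.sum(x)**2      # denominator, constant in the y_i

eps = 1e-6
for j in range(n):
    y_plus, y_minus = y.copy(), y.copy()
    y_plus[j] += eps
    y_minus[j] -= eps
    numeric = (slope(x, y_plus) - slope(x, y_minus)) / (2 * eps)
    analytic = (n * x[j] - np.sum(x)) / D  # the term derived above
    assert np.isclose(numeric, analytic)
print("all partial derivatives match")
```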
