First, I would recommend starting with the solution for the coefficients of the linear least squares regression:
$$A = (X^TX)^{-1}X^TY $$
Where:
$$ A =
\begin{pmatrix}
m \\
b \\
\end{pmatrix}
$$
$m$ and $b$ are the slope and intercept of the line, respectively.
$$
X =
\begin{pmatrix}
x_1 & 1 \\
x_2 & 1 \\
\vdots & \vdots \\
x_n & 1 \\
\end{pmatrix}
$$
$$Y =
\begin{pmatrix}
y_1 \\
y_2 \\
\vdots \\
y_n \\
\end{pmatrix}
$$
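As a quick numerical sanity check (a sketch with made-up data, not part of the derivation), you can evaluate $A = (X^TX)^{-1}X^TY$ directly in numpy and compare it against `np.polyfit`, which solves the same least-squares problem:

```python
import numpy as np

# Hypothetical sample data; any distinct x values work.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# X has a column of x's and a column of ones; Y is the vector of y's.
X = np.column_stack([x, np.ones_like(x)])

# A = (X^T X)^{-1} X^T Y, evaluated with a linear solve
# rather than forming the inverse explicitly.
A = np.linalg.solve(X.T @ X, X.T @ y)
m, b = A

# np.polyfit(x, y, 1) returns [slope, intercept] for the same fit.
assert np.allclose([m, b], np.polyfit(x, y, 1))
```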
Then I would try to figure out how the coefficient vector $A$ changes for a change in one of the entries of $Y$, that is:
$$A' - A = (X^TX)^{-1}X^T(Y'-Y)$$
Where Y' is the original vector Y modified by a shift of some value, say $\epsilon$, in one of the entries:
$$Y' =
\begin{pmatrix}
y_1 \\
y_2 + \epsilon \\
\vdots \\
y_n \\
\end{pmatrix}
$$
For Case 1, you want to show that $m$ doesn't change for any value $\epsilon$ added to an entry of $Y$, which you should find happens only if the $x$ value associated with that entry is the average of all the $x$ values.
For Case 2, you know that the slope of the line will change, and hence the new line will intersect the old line. So you need to calculate the slope and intercept of both the old and new lines by evaluating $A = (X^TX)^{-1}X^TY$ and $A' = (X^TX)^{-1}X^TY'$, and then determine the intersection point of these two lines.
Also, as a further hint, I would recommend explicitly evaluating $(X^TX)^{-1}$ ($X^TX$ is only a 2x2 matrix so this is relatively straightforward) and $X^TY$.
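To see both cases concretely before working through the algebra, here is a small experiment (made-up data, chosen so that the $x$ values have mean 3; the indices perturbed below are arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # mean of x is 3
y = np.array([2.0, 4.0, 5.0, 8.0, 11.0])
X = np.column_stack([x, np.ones_like(x)])

def coeffs(yvec):
    """A = (X^T X)^{-1} X^T Y via a linear solve."""
    return np.linalg.solve(X.T @ X, X.T @ yvec)

m, b = coeffs(y)

# Case 1: shift the entry whose x value equals the mean of the x's.
y_case1 = y.copy()
y_case1[2] += 5.0                  # x[2] == 3 == mean(x)
m1, b1 = coeffs(y_case1)
assert np.isclose(m1, m)           # slope is unchanged
assert not np.isclose(b1, b)       # intercept moves

# Case 2: shift any other entry; the slope changes, so the old and
# new lines intersect at a single point.
y_case2 = y.copy()
y_case2[0] += 5.0                  # x[0] == 1 != mean(x)
m2, b2 = coeffs(y_case2)
assert not np.isclose(m2, m)
x_star = (b - b2) / (m2 - m)       # x coordinate of the intersection
assert np.isclose(m * x_star + b, m2 * x_star + b2)
```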
It's not always the case that least squares will give you a garbage result. Let's talk about what's happening.
If you're fitting a polynomial of degree greater than n-1 to n data points, then you have more coefficients than you do data points. If you create the Vandermonde matrix, it's a short, fat matrix: an underdetermined system. So there are an infinite number of solutions that satisfy your constraints (fitting the polynomial through the data). One of those solutions is the degree n-1 interpolating polynomial, with zeros for the higher-order coefficients. It is possible to get that solution.
How does least squares determine the solution that's chosen? That depends on the implementation of least squares. A common choice, when solving $Ax=b$ is to make the solution $x^\star = A^\dagger b$, where $A^\dagger$ represents the pseudoinverse of $A$. In this case, $x^\star$ is defined to be the solution of the following optimization problem:
\begin{align}
\text{minimize} &\hspace{1em} \|x\|_2 \\
\text{subject to} &\hspace{1em} Ax = \text{proj}_{\text{Im}(A)}(b).
\end{align}
That is, choose the $x$ of smallest norm such that $Ax$ equals the projection of $b$ onto the image of $A$ (the column space of $A$).
It can be the case that a solution with nonzero higher-order coefficients has a smaller norm while still fitting the data. When that happens, and you use that polynomial to interpolate other points, you will see much wider variation (higher frequencies) between and beyond the data, because the higher-order terms come into play.
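A small numpy experiment (with made-up data) makes this concrete: fitting a degree-4 polynomial through 3 points gives a short, fat Vandermonde system, and `np.linalg.lstsq` returns the minimum-norm coefficient vector, which still passes through the data exactly:

```python
import numpy as np

# Three data points, but a degree-4 polynomial (5 unknown coefficients).
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 2.0])

# Short, fat Vandermonde matrix: 3 equations, 5 unknowns.
A = np.vander(x, 5)                # columns: x^4, x^3, x^2, x, 1
c, *_ = np.linalg.lstsq(A, y, rcond=None)

# The fit is still exact: the system is underdetermined, not inconsistent.
assert np.allclose(A @ c, y)

# The quadratic (degree n-1) through the same points, padded with zeros
# for the x^4 and x^3 coefficients, is another valid solution...
q = np.concatenate([[0.0, 0.0], np.polyfit(x, y, 2)])
assert np.allclose(A @ q, y)

# ...but lstsq picks the solution of minimal 2-norm.
assert np.linalg.norm(c) <= np.linalg.norm(q) + 1e-12
```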
Best Answer
Rewrite the numerator
\begin{equation} \sum_i(y_i-\bar y)(x_i-\bar x) = \sum_i\Big(y_i-\frac{1}{n}\sum_j y_j\Big)\Big(x_i-\frac{1}{n}\sum_j x_j\Big) = \sum_i y_ix_i - n\,\bar y\,\bar x \end{equation} Now expand the pairwise sum: \begin{equation} \sum_{i,\,j}(y_i-y_j)(x_i-x_j) = 2n\sum_i y_ix_i - 2\sum_i y_i\sum_j x_j = 2n\Big(\sum_i y_ix_i - n\,\bar y\,\bar x\Big) \end{equation} so that \begin{equation} \sum_i(y_i-\bar y)(x_i-\bar x) = \frac{1}{2n}\sum_{i,\,j}(y_i-y_j)(x_i-x_j) \end{equation}
Rewrite the denominator
\begin{equation} \sum_i(x_i-\bar x)^2 = \sum_i(x_i-\bar x)(x_i-\bar x) = \sum_i x_i^2 - n\,\bar x^2 \end{equation} and the same expansion with $y_i$ replaced by $x_i$ gives \begin{equation} \sum_i(x_i-\bar x)^2 = \frac{1}{2n}\sum_{i,\,j}(x_i-x_j)^2 \end{equation}
Now substitute both into the slope formula.
So \begin{equation} \frac{\sum_i(y_i-\bar y)(x_i-\bar x)}{\sum_i(x_i-\bar x)^2} = \frac{\frac{1}{2n} \sum_{i,\,j}(y_i-y_j)(x_i-x_j)}{\frac{1}{2n} \sum_{i,\,j}(x_i-x_j )^2} = \frac{\sum_{i,\,j}(y_i-y_j)(x_i-x_j)}{\sum_{i,\,j}(x_i-x_j)^2} \end{equation}
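A quick numerical check of this identity (random data, purely illustrative): the pairwise-difference form of the slope agrees with the classical form, and the common constant factor cancels in the ratio.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=20)
y = rng.normal(size=20)
n = len(x)

# Classical form of the least-squares slope.
slope = np.sum((y - y.mean()) * (x - x.mean())) / np.sum((x - x.mean()) ** 2)

# Pairwise-difference form: all (i, j) differences via broadcasting.
dx = x[:, None] - x[None, :]
dy = y[:, None] - y[None, :]
slope_pairwise = np.sum(dy * dx) / np.sum(dx ** 2)

assert np.isclose(slope, slope_pairwise)

# The intermediate identities hold as well, each with a 1/(2n) factor
# that cancels in the ratio.
assert np.isclose(np.sum((y - y.mean()) * (x - x.mean())),
                  np.sum(dy * dx) / (2 * n))
assert np.isclose(np.sum((x - x.mean()) ** 2),
                  np.sum(dx ** 2) / (2 * n))
```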