Solved – Linear Regression and Almost Sure Convergence

asymptotics, convergence, regression

Consider a linear regression model, wherein:
$$
y_{i}=x_{i}\beta+\epsilon_{i}
$$
where notation is standard and $x$ is a scalar. Let us further impose
the following restriction:
$$
\epsilon_{i}|x_{i}\sim N(0,\sigma^{2})
$$

Given mean independence, the OLS estimator $\hat{\beta}_{OLS}$ is
both consistent and unbiased. Invoking the concept of almost sure
convergence, we have that:
$$
\Pr\left(\lim_{n\rightarrow\infty}\left(\hat{\beta}-\beta\right)=0\right)=1
$$

My question is as follows: given our normality assumption on the
error terms, $\hat{\beta}$ is exactly normally distributed. The
normal distribution is continuous, and as a result the probability
of the random variable $\hat{\beta}$ taking on the value $\beta$
is exactly 0. How should I interpret convergence concepts in terms
of continuous distributions? Does the above imply that the distribution
of $\hat{\beta}$ becomes degenerate as $n\rightarrow\infty$?
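
To make this concrete, here is a sketch of the exact sampling distribution
implied by the setup above, taking the model literally (no intercept) and
conditioning on the regressors:
$$
\hat{\beta}=\frac{\sum_{i=1}^{n}x_{i}y_{i}}{\sum_{i=1}^{n}x_{i}^{2}},
\qquad
\hat{\beta}\,\Big|\,x_{1},\dots,x_{n}\sim N\!\left(\beta,\ \frac{\sigma^{2}}{\sum_{i=1}^{n}x_{i}^{2}}\right),
$$
so for every finite $n$ the distribution of $\hat{\beta}$ is continuous and
$\Pr(\hat{\beta}=\beta)=0$, which is exactly the tension being raised.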

Best Answer

An alternative, though informal, way to express $\Pr\left(\lim_{n\rightarrow\infty}\left(\hat{\beta}-\beta\right)=0\right)=1$ is that $\operatorname{Var}(\hat{\beta})\rightarrow 0$ as $n\rightarrow\infty$, bearing in mind that $\hat{\beta}$ is unbiased.
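
In the scalar, no-intercept model from the question this can be made
explicit (a sketch under the same conditioning on the regressors as in
the question's setup):
$$
\operatorname{Var}\left(\hat{\beta}\,\Big|\,x_{1},\dots,x_{n}\right)=\frac{\sigma^{2}}{\sum_{i=1}^{n}x_{i}^{2}}\rightarrow 0
\quad\text{whenever}\quad\sum_{i=1}^{n}x_{i}^{2}\rightarrow\infty,
$$
which holds, for instance, when the $x_{i}$ are i.i.d. with $E\left[x_{i}^{2}\right]>0$.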

You can see the implications of this for the distribution of $\hat{\beta}$. As the sample size increases and approaches the population, the sample essentially becomes "less random", which increases the probability of the estimate lying within any fixed distance of the true parameter. That is, the normal distribution of $\hat{\beta}$ keeps getting "thinner and taller" as the sample size increases. In the limit as $n\rightarrow\infty$, $\hat{\beta}$ is no longer random and its distribution does indeed become degenerate. Perhaps where you are getting mixed up is in overlooking that, as the sample size increases, the distribution itself changes.
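
Purely as an illustration (the parameter values below are arbitrary choices, not taken from the question), a short Monte Carlo sketch in Python shows this "thinner and taller" behaviour: the mean of the simulated $\hat{\beta}$'s stays near $\beta$ while their spread shrinks roughly like $1/\sqrt{n}$.

```python
import numpy as np

rng = np.random.default_rng(0)

beta, sigma = 2.0, 1.0   # true slope and error s.d. (arbitrary illustrative values)
n_reps = 5_000           # Monte Carlo replications per sample size

for n in (10, 100, 1_000, 10_000):
    beta_hats = np.empty(n_reps)
    for r in range(n_reps):
        x = rng.normal(size=n)                 # scalar regressor
        eps = rng.normal(scale=sigma, size=n)  # errors: N(0, sigma^2), independent of x
        y = beta * x + eps
        beta_hats[r] = (x @ y) / (x @ x)       # OLS slope in the no-intercept model
    # The empirical s.d. of beta_hat shrinks as n grows; the mean stays near beta.
    print(f"n = {n:6d}:  mean = {beta_hats.mean():.4f},  sd = {beta_hats.std():.4f}")
```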

Note that this answer is in rather non-technical terms; I hope someone can provide a more rigorous explanation.
