Solved – the difference between confidence intervals and confidence bands

Tags: confidence interval, terminology

Wikipedia states that a confidence band satisfies:

$\Pr(\hat{f}(x) - w(x) < f(x) < \hat{f}(x) + w(x), \text{ for all } x) = 1-\alpha$

where $\hat{f}(x)$ is the point estimate of $f(x)$,

and it states that a confidence interval is:

Let X be a random sample from a probability distribution with statistical parameters θ, which is a quantity to be estimated, and φ, representing quantities that are not of immediate interest. A confidence interval for the parameter θ, with confidence level or confidence coefficient γ, is an interval with random endpoints (u(X), v(X)), determined by the pair of random variables u(X) and v(X), with the property:

${\Pr }_{\theta ,\varphi }(u(X)<\theta <v(X))=\gamma {\text{ for all }}(\theta ,\varphi )$

Is the only difference that the band is built from a point estimate plus/minus a width, while the interval uses random variables for the upper and lower endpoints? In other words, bands are symmetric and intervals don't have to be?

What are the advantages of one over the other?

When would you use one over the other?

Thanks!

Best Answer

Actually it is very simple. For a one-dimensional variable there is a confidence interval, e.g., $\pm 1.5$ around a point estimate. For a two-dimensional plot there is a confidence band: functions above and below the estimate, e.g., from a regression fit.

[Figure: scatter plot with fitted regression line, 95% confidence band, and 95% prediction interval.]
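To make the one-dimensional case concrete, here is a minimal sketch (hypothetical helper, not from the answer) of a normal-approximation 95% confidence interval for a sample mean, which is exactly the "estimate $\pm$ half-width" form:

```python
import numpy as np

def mean_ci(x, z=1.96):
    """Two-sided ~95% normal-approximation CI for the mean of a 1-D sample."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    half = z * x.std(ddof=1) / np.sqrt(len(x))  # the "plus/minus" half-width
    return m - half, m + half

rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=2.0, size=100)
lo, hi = mean_ci(sample)  # a single interval for a single parameter
```

Here the interval happens to be symmetric around the sample mean because of the normal approximation; intervals built in other ways (e.g., profile likelihood) need not be.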

In the figure above, the 95% confidence bands refer to where the linear regression line is located (blue dashes). The 95% "prediction interval" refers to where the data points are located 95% of the time; those are the confidence bands for the data (blue-green dot-dashes).
