Statistics – Difference Between Margin of Error and Standard Error


Is "margin of error" the same as "standard error"?

A (simple) example to illustrate the difference would be great!

Best Answer

Short answer: they differ by a quantile of the reference (usually, the standard normal) distribution.

Long answer: you are estimating a certain population parameter (say, the proportion of people with red hair; it may be something far more complicated, anything from a logistic regression coefficient to the 75th percentile of the gain in achievement scores). You collect your data, you run your estimation procedure, and the very first thing you look at is the point estimate, the quantity that approximates what you want to learn about your population (say, the sample proportion of redheads is 7%). Since this is a sample statistic, it is a random variable. As a random variable, it has a (sampling) distribution that can be characterized by its mean, variance, distribution function, etc. While the point estimate is your best guess regarding the population parameter, the standard error is your best guess regarding the standard deviation of your estimator (or, in some cases, the square root of the mean squared error, $\text{MSE} = \text{bias}^2 + \text{variance}$).
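
To make the "sampling distribution" idea concrete, here is a minimal simulation sketch in Python (the true proportion, sample size, and variable names are assumptions of mine, chosen to match the redhead example): drawing many samples and computing the sample proportion in each shows that the spread of those point estimates is exactly what the standard error tries to approximate from a single sample.

```python
import numpy as np

rng = np.random.default_rng(0)

p_true = 0.07    # hypothetical true proportion of redheads in the population
n = 1000         # sample size
reps = 20_000    # number of simulated samples

# Each replication draws a sample of n people and records the point
# estimate (the sample proportion of redheads).
estimates = rng.binomial(n, p_true, size=reps) / n

# The standard deviation of these estimates is the quantity the standard
# error approximates from one observed sample.
print("SD of simulated estimates:", round(estimates.std(ddof=1), 4))
print("sqrt(p(1-p)/n):           ", round(np.sqrt(p_true * (1 - p_true) / n), 4))
```

Both printed numbers should come out close to 0.0081, matching the formula used below.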

For a sample of size $n=1000$, the standard error of your proportion estimate is $\sqrt{0.07\cdot0.93/1000} \approx 0.0081$. The margin of error is the half-width of the associated confidence interval, so for the 95% confidence level you would use $z_{0.975} \approx 1.96$, giving a margin of error of $1.96 \cdot 0.0081 \approx 0.0158$.
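
As a quick check of that arithmetic, here is a short sketch (scipy is used only for the normal quantile; the variable names are mine):

```python
from math import sqrt
from scipy.stats import norm

p_hat = 0.07   # observed sample proportion
n = 1000       # sample size
level = 0.95   # confidence level

se = sqrt(p_hat * (1 - p_hat) / n)     # standard error, about 0.0081
z = norm.ppf(1 - (1 - level) / 2)      # z_{0.975}, about 1.96
moe = z * se                           # margin of error, about 0.0158

print(f"standard error  = {se:.4f}")
print(f"margin of error = {moe:.4f}")
print(f"{level:.0%} CI: ({p_hat - moe:.4f}, {p_hat + moe:.4f})")
```

The last line also prints the resulting confidence interval, roughly 7% plus or minus 1.6 percentage points.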